Commit

removing space
Signed-off-by: Damien Burks <damien@hackintosh.attlocal.net>

website/load-docs.sh: add versioned docs for latest version (open-policy-agent#4222)

With this change, we should get both a /docs/v0.36.1 and a /docs/latest
in our deployed docs.

Before, the symlink meant Hugo would only build a /docs/latest,
but not the other link.

Signed-off-by: Stephan Renatus <stephan.renatus@gmail.com>

removing -l flag from the test subcommand

Adding additional space to sign off on commit after removing -l from test.go

Signed-off-by: Damien Burks <damien@hackintosh.attlocal.net>

removing space

Signed-off-by: Damien Burks <damien@hackintosh.attlocal.net>

Add Dapr integration (open-policy-agent#4229)

Also included some improvements to the Rego checks:

* Pass GITHUB_TOKEN to the policy to avoid exceeding the API quota
* Ensure required attributes are included in the integration
* Ensure committed .json files are valid JSON

Signed-off-by: Anders Eknert <anders@eknert.com>

Follow redirects in http.send Rego checks (open-policy-agent#4232)

Signed-off-by: Anders Eknert <anders@eknert.com>

fixing broken links in the online documentation (open-policy-agent#4224)

Signed-off-by: Peter Helewski <phelewski@gmail.com>

plugins/bundle: Update persisted bundle activation mechanism

Earlier, errors encountered while loading and activating persisted
bundles would cause the OPA runtime to exit. This behavior differed
from bundle downloads, where activation errors, if any, could be
resolved in successive download attempts. This fix adds a retry
mechanism for activating persisted bundles, mimicking the behavior
seen during bundle downloads. Any errors encountered during the
process are surfaced in the bundle's status update and no longer
result in an abrupt exit.

Fixes: open-policy-agent#3840

Signed-off-by: Ashutosh Narkar <anarkar4387@gmail.com>

built-ins: add graph.reachable_paths (open-policy-agent#4205)

This new built-in functionality allows callers to find all reachable
paths in a graph based on an array or set of root nodes. See the
updates to policy-reference.md for more details and usage information.

Signed-off-by: Justin Lindh <justin.lindh@webfilings.com>

ast/parser: don't error when future kw import is redundant (open-policy-agent#4237)

When passing `ParserOptions{AllFutureKeywords: true}` or
`ParserOptions{FutureKeywords: []string{"in"}}` to the `ast` package's
parse methods, and when parsing a module that contains

    import future.keywords.in

then we would previously have raised an error: when the parser
encountered the `in`, it complained about not expecting an "in" token
there.

Now, parsing imports is done with a scanner copy that knows nothing
about the future keywords.

Signed-off-by: Stephan Renatus <stephan.renatus@gmail.com>

ast: Adding duplicate imports check to compiler strict mode (open-policy-agent#4228)

When strict mode is enabled, an import shadowing another import is an error.

Fixes: open-policy-agent#2698
Signed-off-by: Johan Fylling <johan.dev@fylling.se>

ast: add `every` future keyword (parser, internal representation) (open-policy-agent#4179)

* ast: add 'every' future keyword, parser support, scaffolding

With this commit, "every x in xs { ... }" and "every k, v in { ... }"
will be parsed into a new struct, which is basically

    Every {
      Key, Value *Term
      Domain *Term
      Body Body
    }

This includes the required changes to visitors, comparisons, copy, ...
all the ceremony required to introduce a new keyword.

* format: format every statements

* ast: hide 'every' from capabilities for now

With this change,

- capabilities.json will NOT mention "every"
- "import future.keywords" will NOT get you "every"
- "import future.keywords.every" will complain about "every" being unknown

In tests, we're setting an unexported field in `ast.ParserOptions`,
which makes the parser treat "every" as it eventually will be
treated when unveiled.

The formatting tests are bluntly SKIPPED for now, to be re-enabled later.

---------------
NB: The work on "every" is ongoing.
Rewriting and evaluation follows. When it's documented, we'll unhide it.
---------------

Signed-off-by: Stephan Renatus <stephan.renatus@gmail.com>

bundle: Roundtrip manifest before hashing

When OPA verifies the content of the manifest file, it first parses
it into a JSON structure, recursively orders the fields of all
objects alphabetically, and then applies the hash function. The same
process was not followed while generating the hash for the manifest
content, which would result in a digest mismatch during
verification. This can be observed with a manifest that contains
metadata.

Fixes: open-policy-agent#4233

Signed-off-by: Ashutosh Narkar <anarkar4387@gmail.com>
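The fix amounts to canonicalizing the manifest the same way on both the hashing and the verification side. A minimal sketch of the idea (assuming SHA-256; it relies on Go's encoding/json emitting map keys in sorted order at every nesting level, so round-tripping through a map yields the alphabetically ordered form):

```go
package main

import (
	"crypto/sha256"
	"encoding/json"
	"fmt"
)

// canonicalHash round-trips raw JSON through an interface{} value so
// that, when re-marshalled, all object keys come out sorted at every
// level. Hashing this canonical form on both sides avoids the digest
// mismatch described above.
func canonicalHash(raw []byte) (string, error) {
	var v interface{}
	if err := json.Unmarshal(raw, &v); err != nil {
		return "", err
	}
	canonical, err := json.Marshal(v) // map keys are emitted sorted
	if err != nil {
		return "", err
	}
	return fmt.Sprintf("%x", sha256.Sum256(canonical)), nil
}

func main() {
	// Same manifest content, different field order: the canonical
	// hashes match, so verification succeeds.
	a := []byte(`{"revision":"abc","metadata":{"b":1,"a":2}}`)
	b := []byte(`{"metadata":{"a":2,"b":1},"revision":"abc"}`)
	ha, _ := canonicalHash(a)
	hb, _ := canonicalHash(b)
	fmt.Println(ha == hb) // true
}
```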

Removing more deprecated code related to failureLine

Signed-off-by: Damien Burks <damien@hackintosh.attlocal.net>
Damien Burks authored and damienjburks committed Jan 19, 2022
1 parent bc3e610 commit fbdeb47
Showing 54 changed files with 1,359 additions and 224 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/pull-request.yaml
@@ -294,4 +294,6 @@ jobs:
curl --silent --fail --header 'Authorization: Bearer ${{ secrets.GITHUB_TOKEN }}' \
https://api.github.com/repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}/files \
| opa eval --bundle build/policy/ --format values --stdin-input --fail-defined 'data.files.deny[message]'
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

4 changes: 2 additions & 2 deletions CHANGELOG.md
@@ -1485,7 +1485,7 @@ more information see https://openpolicyagent.org/docs/latest/privacy/.
#### New `opa build` command

The `opa build` command can now be used to package OPA policy and data files
into [bundles](https://www.openpolicyagent.org/docs/latest/management/#bundles)
into [bundles](https://www.openpolicyagent.org/docs/latest/management-bundles)
that can be easily distributed via HTTP. See `opa build --help` for details.
This change is backwards incompatible. If you were previously relying on `opa
build` to compile policies to wasm, you can still do so:
@@ -2420,7 +2420,7 @@ pass `"force_json_decode": true` as in the `http.send` parameters.
* This release adds support for scoping bundles to specific roots
under `data`. This allows bundles to be used in conjunction with
sidecars like `kube-mgmt` that load local data and policy into
OPA. See the [Bundles](https://www.openpolicyagent.org/docs/bundles.html)
OPA. See the [Bundles](https://www.openpolicyagent.org/docs/latest/management-bundles)
page for more details.

* This release includes a small but backwards incompatible change to
21 changes: 21 additions & 0 deletions ast/builtins.go
@@ -211,6 +211,7 @@ var DefaultBuiltins = [...]*Builtin{
// Graphs
WalkBuiltin,
ReachableBuiltin,
ReachablePathsBuiltin,

// Sort
Sort,
@@ -2007,6 +2008,26 @@ var ReachableBuiltin = &Builtin{
),
}

// ReachablePathsBuiltin computes the set of reachable paths in the graph from a set
// of starting nodes.
var ReachablePathsBuiltin = &Builtin{
Name: "graph.reachable_paths",
Decl: types.NewFunction(
types.Args(
types.NewObject(
nil,
types.NewDynamicProperty(
types.A,
types.NewAny(
types.NewSet(types.A),
types.NewArray(nil, types.A)),
)),
types.NewAny(types.NewSet(types.A), types.NewArray(nil, types.A)),
),
types.NewSet(types.NewArray(nil, types.A)),
),
}

/**
* Sorting
*/
3 changes: 3 additions & 0 deletions ast/capabilities.go
@@ -50,6 +50,9 @@ func CapabilitiesForThisVersion() *Capabilities {
})

for kw := range futureKeywords {
if kw == "every" { // TODO(sr): drop when ready
continue
}
f.FutureKeywords = append(f.FutureKeywords, kw)
}
sort.Strings(f.FutureKeywords)
5 changes: 5 additions & 0 deletions ast/compare.go
@@ -201,6 +201,9 @@ func Compare(a, b interface{}) int {
case *SomeDecl:
b := b.(*SomeDecl)
return a.Compare(b)
case *Every:
b := b.(*Every)
return a.Compare(b)
case *With:
b := b.(*With)
return a.Compare(b)
@@ -272,6 +275,8 @@ func sortOrder(x interface{}) int {
return 100
case *SomeDecl:
return 101
case *Every:
return 102
case *With:
return 110
case *Head:
28 changes: 28 additions & 0 deletions ast/compile.go
@@ -109,6 +109,7 @@ type Compiler struct {
debug debug.Debug // emits debug information produced during compilation
schemaSet *SchemaSet // user-supplied schemas for input and data documents
inputType types.Type // global input type retrieved from schema set
strict bool // enforce strict compilation checks
}

// CompilerStage defines the interface for stages in the compiler.
@@ -254,6 +255,7 @@ func NewCompiler() *Compiler {
metricName string
f func()
}{
{"CheckDuplicateImports", "compile_stage_check_duplicate_imports", c.checkDuplicateImports},
// Reference resolution should run first as it may be used to lazily
// load additional modules. If any stages run before resolution, they
// need to be re-run after resolution.
@@ -367,6 +369,12 @@ func (c *Compiler) WithUnsafeBuiltins(unsafeBuiltins map[string]struct{}) *Compi
return c
}

// WithStrict enables strict mode in the compiler.
func (c *Compiler) WithStrict(strict bool) *Compiler {
c.strict = strict
return c
}

// QueryCompiler returns a new QueryCompiler object.
func (c *Compiler) QueryCompiler() QueryCompiler {
c.init()
@@ -1277,6 +1285,26 @@ func (c *Compiler) getExports() *util.HashMap {
return rules
}

func (c *Compiler) checkDuplicateImports() {
if !c.strict {
return
}

for _, name := range c.sorted {
mod := c.Modules[name]
processedImports := map[Var]*Import{}

for _, imp := range mod.Imports {
name := imp.Name()
if processed, conflict := processedImports[name]; conflict {
c.err(NewError(CompileErr, imp.Location, "import must not shadow %v", processed))
} else {
processedImports[name] = imp
}
}
}
}

// resolveAllRefs resolves references in expressions to their fully qualified values.
//
// For instance, given the following module:
116 changes: 97 additions & 19 deletions ast/compile_test.go
@@ -1454,25 +1454,7 @@ func TestCompilerRewriteExprTerms(t *testing.T) {
t.Fatalf("Expected modules to be equal. Expected:\n\n%v\n\nGot:\n\n%v", expected, compiler.Modules["test"])
}
case Errors:
if len(exp) != len(compiler.Errors) {
t.Fatalf("Expected %d errors, got %d:\n\n%s\n", len(exp), len(compiler.Errors), compiler.Errors.Error())
}
incorrectErrs := false
for _, e := range exp {
found := false
for _, actual := range compiler.Errors {
if e.Message == actual.Message {
found = true
break
}
}
if !found {
incorrectErrs = true
}
}
if incorrectErrs {
t.Fatalf("Expected errors:\n\n%s\n\nGot:\n\n%s\n", exp.Error(), compiler.Errors.Error())
}
assertErrors(t, compiler.Errors, exp, false)
default:
t.Fatalf("Unsupported value type for test case 'expected' field: %v", exp)
}
@@ -1481,6 +1463,102 @@ func TestCompilerRewriteExprTerms(t *testing.T) {
}
}

func TestCompilerCheckDuplicateImports(t *testing.T) {
cases := []struct {
note string
module string
expectedErrors Errors
strict bool
}{
{
note: "shadow",
module: `package test
import input.noconflict
import input.foo
import data.foo
import data.bar.foo
`,
expectedErrors: Errors{
&Error{
Location: NewLocation([]byte("import"), "", 4, 5),
Message: "import must not shadow import input.foo",
},
&Error{
Location: NewLocation([]byte("import"), "", 5, 5),
Message: "import must not shadow import input.foo",
},
},
strict: true,
}, {
note: "alias shadow",
module: `package test
import input.noconflict
import input.foo
import input.bar as foo
`,
expectedErrors: Errors{
&Error{
Location: NewLocation([]byte("import"), "", 4, 5),
Message: "import must not shadow import input.foo",
},
},
strict: true,
}, {
note: "no strict",
module: `package test
import input.noconflict
import input.foo
import data.foo
import data.bar.foo
import input.bar as foo
`,
strict: false,
},
}

for _, tc := range cases {
t.Run(tc.note, func(t *testing.T) {
compiler := NewCompiler().WithStrict(tc.strict)
compiler.Modules = map[string]*Module{
"test": MustParseModule(tc.module),
}

compileStages(compiler, compiler.checkDuplicateImports)

if len(tc.expectedErrors) > 0 {
assertErrors(t, compiler.Errors, tc.expectedErrors, true)
} else {
assertNotFailed(t, compiler)
}
})
}
}

func assertErrors(t *testing.T, actual Errors, expected Errors, assertLocation bool) {
t.Helper()
if len(expected) != len(actual) {
t.Fatalf("Expected %d errors, got %d:\n\n%s\n", len(expected), len(actual), actual.Error())
}
incorrectErrs := false
for _, e := range expected {
found := false
for _, actual := range actual {
if e.Message == actual.Message {
if !assertLocation || e.Location.Equal(actual.Location) {
found = true
break
}
}
}
if !found {
incorrectErrs = true
}
}
if incorrectErrs {
t.Fatalf("Expected errors:\n\n%s\n\nGot:\n\n%s\n", expected.Error(), actual.Error())
}
}

func TestCompilerResolveAllRefs(t *testing.T) {
c := NewCompiler()
c.Modules = getCompilerTestModules()
15 changes: 15 additions & 0 deletions ast/internal/scanner/scanner.go
@@ -112,6 +112,21 @@ func (s *Scanner) WithKeywords(kws map[string]tokens.Token) *Scanner {
return &cpy
}

// WithoutKeywords returns a new copy of the Scanner struct `s`, with the
// set of known keywords being that of `s` with `kws` removed.
// The previously known keywords are returned for a convenient reset.
func (s *Scanner) WithoutKeywords(kws map[string]tokens.Token) (*Scanner, map[string]tokens.Token) {
cpy := *s
kw := s.keywords
cpy.keywords = make(map[string]tokens.Token, len(s.keywords)-len(kws))
for kw, tok := range s.keywords {
if _, ok := kws[kw]; !ok {
cpy.AddKeyword(kw, tok)
}
}
return &cpy, kw
}

// Scan will increment the scanners position in the source
// code until the next token is found. The token, starting position
// of the token, string literal, and any errors encountered are
3 changes: 3 additions & 0 deletions ast/internal/tokens/tokens.go
@@ -65,6 +65,8 @@ const (
Lte
Dot
Semicolon

Every
)

var strings = [...]string{
@@ -112,6 +114,7 @@ var strings = [...]string{
Lte: "lte",
Dot: ".",
Semicolon: ";",
Every: "every",
}

var keywords = map[string]Token{
