Compare commits

30 commits: 970b5697bd ... 0.1.0
| Author | SHA1 | Date |
|---|---|---|
| | 0ffcecf19e | |
| | 761cf83b8e | |
| | 7caf3a5da5 | |
| | 94ee7e4880 | |
| | ce9b68200e | |
| | e3c84fcf60 | |
| | 4a515fd6c3 | |
| | 14cba1b530 | |
| | 462c832651 | |
| | 77fe3e9cac | |
| | 0ee44c0a27 | |
| | d450d358b4 | |
| | 2cdcfe2812 | |
| | ef7729475a | |
| | 99bd5bffdd | |
| | 4379960835 | |
| | 2aeec1e5f6 | |
| | 5853365707 | |
| | 5c3f05a1a4 | |
| | e2c87c90f3 | |
| | 1ea518a58a | |
| | 0654062d08 | |
| | a88f833f49 | |
| | b2e963fc04 | |
| | 8fe319de2d | |
| | 93d48bd3ed | |
| | 164dad896c | |
| | f111bf1aaa | |
| | 4a624aa929 | |
| | 5b0834137b | |
.gitignore (vendored), 2 additions:

```diff
@@ -1,2 +1,4 @@
 build
 *.log
+mdt
+*.out
```
LICENSE (new file), 21 additions:

```diff
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2026 MARTe Community
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
```
Makefile (new file), 24 additions:

```diff
@@ -0,0 +1,24 @@
+BINARY_NAME=mdt
+BUILD_DIR=build
+
+.PHONY: all build test coverage clean install
+
+all: test build
+
+build:
+	mkdir -p $(BUILD_DIR)
+	go build -o $(BUILD_DIR)/$(BINARY_NAME) ./cmd/mdt
+
+test:
+	go test -v ./...
+
+coverage:
+	go test -cover -coverprofile=coverage.out ./test/... -coverpkg=./internal/...
+	go tool cover -func=coverage.out
+
+clean:
+	rm -rf $(BUILD_DIR)
+	rm -f coverage.out
+
+install:
+	go install ./cmd/mdt
```
README.md (new file), 96 additions:

```diff
@@ -0,0 +1,96 @@
+# MARTe Development Tools (mdt)
+
+`mdt` is a comprehensive toolkit for developing, validating, and building configurations for the MARTe real-time framework. It provides a CLI and a Language Server Protocol (LSP) server to enhance the development experience.
+
+## Features
+
+- **LSP Server**: Real-time syntax checking, validation, autocomplete, hover documentation, and navigation (Go to Definition/References).
+- **Builder**: Merges multiple configuration files into a single, ordered output file.
+- **Formatter**: Standardizes configuration file formatting.
+- **Validator**: Advanced semantic validation using [CUE](https://cuelang.org/) schemas, ensuring type safety and structural correctness.
+
+## Installation
+
+### From Source
+
+Requirements: Go 1.21+
+
+```bash
+go install github.com/marte-community/marte-dev-tools/cmd/mdt@latest
+```
+
+## Usage
+
+### CLI Commands
+
+- **Check**: Run validation on a file or project.
+  ```bash
+  mdt check path/to/project
+  ```
+- **Build**: Merge project files into a single output.
+  ```bash
+  mdt build -o output.marte main.marte
+  ```
+- **Format**: Format configuration files.
+  ```bash
+  mdt fmt path/to/file.marte
+  ```
+- **LSP**: Start the language server (used by editor plugins).
+  ```bash
+  mdt lsp
+  ```
+
+### Editor Integration
+
+`mdt lsp` implements the Language Server Protocol. You can use it with any LSP-compatible editor (VS Code, Neovim, Emacs, etc.).
+
+## MARTe Configuration
+
+The tools support the MARTe configuration format with extended features:
+- **Objects**: `+Node = { Class = ... }`
+- **Signals**: `Signal = { Type = ... }`
+- **Namespaces**: `#package PROJECT.NODE` for organizing multi-file projects.
+
+### Validation & Schema
+
+Validation is fully schema-driven using CUE.
+
+- **Built-in Schema**: Covers standard MARTe classes (`StateMachine`, `GAM`, `DataSource`, `RealTimeApplication`, etc.).
+- **Custom Schema**: Add a `.marte_schema.cue` file to your project root to extend or override definitions.
+
+**Example `.marte_schema.cue`:**
+```cue
+package schema
+
+#Classes: {
+	MyCustomGAM: {
+		Param1: int
+		Param2?: string
+		...
+	}
+}
+```
+
+### Pragmas (Suppressing Warnings)
+
+Use comments starting with `//!` to control validation behavior:
+
+- `//!unused: Reason` - Suppress "Unused GAM" or "Unused Signal" warnings.
+- `//!implicit: Reason` - Suppress "Implicitly Defined Signal" warnings.
+- `//!cast(DefinedType, UsageType)` - Allow type mismatch between definition and usage (e.g. `//!cast(uint32, int32)`).
+- `//!allow(unused)` - Global suppression for the file.
+
+## Development
+
+### Building
+```bash
+go build ./cmd/mdt
+```
+
+### Running Tests
+```bash
+go test ./...
+```
+
+## License
+MIT
```
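As an illustration of the pragma rules described in the README above, a configuration fragment might look like the following. This is an editor's sketch, not part of the commit; the class, data source, and signal names (`Timings`, `Counter`) are made up for the example:

```
//!allow(unused): prototype file, GAMs wired up later

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        //!cast(uint32, int32): counter wrap is handled downstream
        Counter = {
            DataSource = Timings
            Type = int32
        }
    }
}
```

Per the README, the `//!allow(unused)` pragma applies file-wide, while the `//!cast` pragma sits on the line directly above the signal it applies to.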
```diff
@@ -4,13 +4,13 @@ import (
 	"bytes"
 	"os"
 
-	"github.com/marte-dev/marte-dev-tools/internal/builder"
-	"github.com/marte-dev/marte-dev-tools/internal/formatter"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/lsp"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/builder"
+	"github.com/marte-community/marte-dev-tools/internal/formatter"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/lsp"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func main() {
```
examples/pragma_test.marte (new file), 27 additions:

```diff
@@ -0,0 +1,27 @@
+//!allow(unused): Ignore unused GAMs in this file
+//!allow(implicit): Ignore implicit signals in this file
+
++Data = {
+    Class = ReferenceContainer
+    +MyDS = {
+        Class = FileReader
+        Filename = "test"
+        Signals = {}
+    }
+}
+
++MyGAM = {
+    Class = IOGAM
+    InputSignals = {
+        // Implicit signal (not in MyDS)
+        ImplicitSig = {
+            DataSource = MyDS
+            Type = uint32
+        }
+    }
+}
+
+// Unused GAM
++UnusedGAM = {
+    Class = IOGAM
+}
```
examples/test_app.marte (new file), 6417 additions: file diff suppressed because it is too large.
go.mod, 17 changes:

```diff
@@ -1,3 +1,18 @@
-module github.com/marte-dev/marte-dev-tools
+module github.com/marte-community/marte-dev-tools
 
 go 1.25.6
+
+require cuelang.org/go v0.15.3
+
+require (
+	github.com/cockroachdb/apd/v3 v3.2.1 // indirect
+	github.com/emicklei/proto v1.14.2 // indirect
+	github.com/google/uuid v1.6.0 // indirect
+	github.com/mitchellh/go-wordwrap v1.0.1 // indirect
+	github.com/pelletier/go-toml/v2 v2.2.4 // indirect
+	github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91 // indirect
+	go.yaml.in/yaml/v3 v3.0.4 // indirect
+	golang.org/x/net v0.46.0 // indirect
+	golang.org/x/text v0.30.0 // indirect
+	google.golang.org/protobuf v1.33.0 // indirect
+)
```
go.sum (new file), 53 additions:

```diff
@@ -0,0 +1,53 @@
+cuelabs.dev/go/oci/ociregistry v0.0.0-20250722084951-074d06050084 h1:4k1yAtPvZJZQTu8DRY8muBo0LHv6TqtrE0AO5n6IPYs=
+cuelabs.dev/go/oci/ociregistry v0.0.0-20250722084951-074d06050084/go.mod h1:4WWeZNxUO1vRoZWAHIG0KZOd6dA25ypyWuwD3ti0Tdc=
+cuelang.org/go v0.15.3 h1:JKR/lZVwuIGlLTGIaJ0jONz9+CK3UDx06sQ6DDxNkaE=
+cuelang.org/go v0.15.3/go.mod h1:NYw6n4akZcTjA7QQwJ1/gqWrrhsN4aZwhcAL0jv9rZE=
+github.com/cockroachdb/apd/v3 v3.2.1 h1:U+8j7t0axsIgvQUqthuNm82HIrYXodOV2iWLWtEaIwg=
+github.com/cockroachdb/apd/v3 v3.2.1/go.mod h1:klXJcjp+FffLTHlhIG69tezTDvdP065naDsHzKhYSqc=
+github.com/emicklei/proto v1.14.2 h1:wJPxPy2Xifja9cEMrcA/g08art5+7CGJNFNk35iXC1I=
+github.com/emicklei/proto v1.14.2/go.mod h1:rn1FgRS/FANiZdD2djyH7TMA9jdRDcYQ9IEN9yvjX0A=
+github.com/go-quicktest/qt v1.101.0 h1:O1K29Txy5P2OK0dGo59b7b0LR6wKfIhttaAhHUyn7eI=
+github.com/go-quicktest/qt v1.101.0/go.mod h1:14Bz/f7NwaXPtdYEgzsx46kqSxVwTbzVZsDC26tQJow=
+github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
+github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
+github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
+github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
+github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
+github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
+github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
+github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
+github.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=
+github.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=
+github.com/lib/pq v1.10.7 h1:p7ZhMD+KsSRozJr34udlUrhboJwWAgCg34+/ZZNvZZw=
+github.com/lib/pq v1.10.7/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
+github.com/mitchellh/go-wordwrap v1.0.1 h1:TLuKupo69TCn6TQSyGxwI1EblZZEsQ0vMlAFQflz0v0=
+github.com/mitchellh/go-wordwrap v1.0.1/go.mod h1:R62XHJLzvMFRBbcrT7m7WgmE1eOyTSsCt+hzestvNj0=
+github.com/opencontainers/go-digest v1.0.0 h1:apOUWs51W5PlhuyGyz9FCeeBIOUDA/6nW8Oi/yOhh5U=
+github.com/opencontainers/go-digest v1.0.0/go.mod h1:0JzlMkj0TRzQZfJkVvzbP0HBR3IKzErnv2BNG4W4MAM=
+github.com/opencontainers/image-spec v1.1.1 h1:y0fUlFfIZhPF1W537XOLg0/fcx6zcHCJwooC2xJA040=
+github.com/opencontainers/image-spec v1.1.1/go.mod h1:qpqAh3Dmcf36wStyyWU+kCeDgrGnAve2nCC8+7h8Q0M=
+github.com/pelletier/go-toml/v2 v2.2.4 h1:mye9XuhQ6gvn5h28+VilKrrPoQVanw5PMw/TB0t5Ec4=
+github.com/pelletier/go-toml/v2 v2.2.4/go.mod h1:2gIqNv+qfxSVS7cM2xJQKtLSTLUE9V8t9Stt+h56mCY=
+github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91 h1:s1LvMaU6mVwoFtbxv/rCZKE7/fwDmDY684FfUe4c1Io=
+github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91/go.mod h1:JSbkp0BviKovYYt9XunS95M3mLPibE9bGg+Y95DsEEY=
+github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
+github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
+go.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=
+go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
+golang.org/x/mod v0.29.0 h1:HV8lRxZC4l2cr3Zq1LvtOsi/ThTgWnUk/y64QSs8GwA=
+golang.org/x/mod v0.29.0/go.mod h1:NyhrlYXJ2H4eJiRy/WDBO6HMqZQ6q9nk4JzS3NuCK+w=
+golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
+golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
+golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
+golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
+golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
+golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
+golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
+golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
+golang.org/x/tools v0.38.0 h1:Hx2Xv8hISq8Lm16jvBZ2VQf+RLmbd7wVUsALibYI/IQ=
+golang.org/x/tools v0.38.0/go.mod h1:yEsQ/d/YK8cjh0L6rZlY8tgtlKiBNTL14pGDJPJpYQs=
+google.golang.org/protobuf v1.33.0 h1:uNO2rsAINq/JlFpSdYEKIZ0uKD/R9cpdv0T+yoGwGmI=
+google.golang.org/protobuf v1.33.0/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos=
+gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 h1:qIbj1fsPNlZgppZ+VLlY7N33q108Sa+fhmuc+sWQYwY=
+gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
```
```diff
@@ -6,8 +6,8 @@ import (
 	"sort"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 type Builder struct {
```
```diff
@@ -71,86 +71,38 @@ func (b *Builder) writeNodeContent(f *os.File, node *index.ProjectNode, indent i
 	indentStr := strings.Repeat(" ", indent)
 
-	// If this node has a RealName (e.g. +App), we print it as an object definition
-	// UNLESS it is the top-level output file itself?
-	// If we are writing "App.marte", maybe we are writing the *body* of App?
-	// Spec: "unifying multi-file project into a single configuration output"
-
-	// Let's assume we print the Node itself.
 	if node.RealName != "" {
 		fmt.Fprintf(f, "%s%s = {\n", indentStr, node.RealName)
 		indent++
 		indentStr = strings.Repeat(" ", indent)
 	}
 
+	writtenChildren := make(map[string]bool)
+
 	// 2. Write definitions from fragments
 	for _, frag := range node.Fragments {
-		// Use formatter logic to print definitions
-		// We need a temporary Config to use Formatter?
-		// Or just reimplement basic printing? Formatter is better.
-		// But Formatter prints to io.Writer.
-
-		// We can reuse formatDefinition logic if we exposed it, or just copy basic logic.
-		// Since we need to respect indentation, using Formatter.Format might be tricky
-		// unless we wrap definitions in a dummy structure.
-
 		for _, def := range frag.Definitions {
-			// Basic formatting for now, referencing formatter style
-			b.writeDefinition(f, def, indent)
+			switch d := def.(type) {
+			case *parser.Field:
+				b.writeDefinition(f, d, indent)
+			case *parser.ObjectNode:
+				norm := index.NormalizeName(d.Name)
+				if child, ok := node.Children[norm]; ok {
+					if !writtenChildren[norm] {
+						b.writeNodeContent(f, child, indent)
+						writtenChildren[norm] = true
+					}
+				}
+			}
 		}
 	}
 
 	// 3. Write Children (recursively)
-	// Children are sub-nodes defined implicitly via #package A.B or explicitly +Sub
-	// Explicit +Sub are handled via Fragments logic (they are definitions in fragments).
-	// Implicit nodes (from #package A.B.C where B was never explicitly defined)
-	// show up in Children map but maybe not in Fragments?
-
-	// If a Child is NOT in fragments (implicit), we still need to write it.
-	// If it IS in fragments (explicit +Child), it was handled in loop above?
-	// Wait. My Indexer puts `+Sub` into `node.Children["Sub"]` AND adds a `Fragment` to `node` containing `+Sub` object?
-
-	// Let's check Indexer.
-	// Case ObjectNode:
-	// Adds Fragment to `child` (the Sub node).
-	// Does NOT add `ObjectNode` definition to `node`'s fragment list?
-	// "pt.addObjectFragment(child...)"
-	// It does NOT add to `fileFragment.Definitions`.
-
-	// So `node.Fragments` only contains Fields!
-	// Children are all in `node.Children`.
-
-	// So:
-	// 1. Write Fields (from Fragments).
-	// 2. Write Children (from Children map).
-
-	// But wait, Fragments might have order?
-	// "Relative ordering within a file is preserved."
-	// My Indexer splits Fields and Objects.
-	// Fields go to Fragments. Objects go to Children.
-	// This loses the relative order between Fields and Objects in the source file!
-
-	// Correct Indexer approach for preserving order:
-	// `Fragment` should contain a list of `Entry`.
-	// `Entry` can be `Field` OR `ChildNodeName`.
-
-	// But I just rewrote Indexer to split them.
-	// If strict order is required "within a file", my Indexer is slightly lossy regarding Field vs Object order.
-	// Spec: "Relative ordering within a file is preserved."
-
-	// To fix this without another full rewrite:
-	// Iterating `node.Children` alphabetically is arbitrary.
-	// We should ideally iterate them in the order they appear.
-
-	// For now, I will proceed with writing Children after Fields, which is a common convention,
-	// unless strict interleaving is required.
-	// Given "Class first" rule, reordering happens anyway.
-
-	// Sorting Children?
-	// Maybe keep a list of OrderedChildren in ProjectNode?
-
 	sortedChildren := make([]string, 0, len(node.Children))
 	for k := range node.Children {
-		sortedChildren = append(sortedChildren, k)
+		if !writtenChildren[k] {
+			sortedChildren = append(sortedChildren, k)
+		}
 	}
 	sort.Strings(sortedChildren) // Alphabetical for determinism
```
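The ordering idea in this hunk (skip children already emitted via fragments, then sort the remainder alphabetically so output does not depend on Go's randomized map iteration order) can be sketched in isolation. The names below are simplified stand-ins, not the real builder types:

```go
package main

import (
	"fmt"
	"sort"
)

// orderChildren returns the child names that still need to be written,
// sorted alphabetically so the output is deterministic across runs
// (Go map iteration order is intentionally randomized).
func orderChildren(children map[string]bool, written map[string]bool) []string {
	pending := make([]string, 0, len(children))
	for name := range children {
		if !written[name] { // skip children already emitted elsewhere
			pending = append(pending, name)
		}
	}
	sort.Strings(pending)
	return pending
}

func main() {
	children := map[string]bool{"Timer": true, "App": true, "GAM1": true}
	written := map[string]bool{"GAM1": true}
	fmt.Println(orderChildren(children, written)) // [App Timer]
}
```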
```diff
@@ -6,7 +6,7 @@ import (
 	"sort"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 type Insertable struct {
@@ -54,7 +54,7 @@ func fixComment(text string) string {
 			return "//# " + text[3:]
 		}
 	} else if strings.HasPrefix(text, "//") {
 		if len(text) > 2 && text[2] != ' ' && text[2] != '#' && text[2] != '!' {
 			return "// " + text[2:]
 		}
 	}
@@ -101,7 +101,7 @@ func (f *Formatter) formatDefinition(def parser.Definition, indent int) int {
 		fmt.Fprintln(f.writer)
 
 		f.formatSubnode(d.Subnode, indent+1)
 
 		fmt.Fprintf(f.writer, "%s}", indentStr)
 		return d.Subnode.EndPosition.Line
 	}
@@ -175,7 +175,7 @@ func (f *Formatter) flushCommentsBefore(pos parser.Position, indent int, stick b
 			break
 		}
 	}
 	// If stick is true, we don't print extra newline.
 	// The caller will print the definition immediately after this function returns.
 	// If stick is false (e.g. end of block comments), we act normally.
 	// But actually, the previous implementation didn't print extra newlines between comments and code
@@ -208,4 +208,4 @@ func (f *Formatter) popComment() string {
 	c := f.insertables[f.cursor]
 	f.cursor++
 	return c.Text
 }
```
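The comment-normalization rule visible in the `fixComment` hunk above (insert a space after `//` unless the comment is a `//#` doc comment or a `//!` pragma) can be sketched as a standalone function. This is an approximation of the visible branch of the formatter's behavior, not the exact code:

```go
package main

import (
	"fmt"
	"strings"
)

// fixComment normalizes plain "//" comments to have a space after the
// slashes, while leaving "//#" doc comments and "//!" pragmas untouched.
func fixComment(text string) string {
	if strings.HasPrefix(text, "//") && len(text) > 2 {
		c := text[2]
		if c != ' ' && c != '#' && c != '!' {
			return "// " + text[2:]
		}
	}
	return text
}

func main() {
	fmt.Println(fixComment("//no space"))   // "// no space"
	fmt.Println(fixComment("//!unused: x")) // pragma left unchanged
	fmt.Println(fixComment("//# doc"))      // doc comment left unchanged
}
```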
```diff
@@ -5,14 +5,15 @@ import (
 	"path/filepath"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 type ProjectTree struct {
 	Root          *ProjectNode
 	References    []Reference
 	IsolatedFiles map[string]*ProjectNode
+	GlobalPragmas map[string][]string
 }
 
 func (pt *ProjectTree) ScanDirectory(rootPath string) error {
@@ -50,6 +51,8 @@ type ProjectNode struct {
 	Children map[string]*ProjectNode
 	Parent   *ProjectNode
 	Metadata map[string]string // Store extra info like Class, Type, Size
+	Target   *ProjectNode      // Points to referenced node (for Direct References/Links)
+	Pragmas  []string
 }
@@ -57,6 +60,7 @@ type Fragment struct {
 	Definitions []parser.Definition
 	IsObject    bool
 	ObjectPos   parser.Position
 	EndPos      parser.Position
+	Doc         string // Documentation for this fragment (if object)
 }
@@ -67,6 +71,7 @@ func NewProjectTree() *ProjectTree {
 			Metadata: make(map[string]string),
 		},
 		IsolatedFiles: make(map[string]*ProjectNode),
+		GlobalPragmas: make(map[string][]string),
 	}
 }
@@ -87,6 +92,7 @@ func (pt *ProjectTree) RemoveFile(file string) {
 	pt.References = newRefs
 
 	delete(pt.IsolatedFiles, file)
+	delete(pt.GlobalPragmas, file)
 	pt.removeFileFromNode(pt.Root, file)
 }
@@ -154,6 +160,15 @@ func (pt *ProjectTree) extractFieldMetadata(node *ProjectNode, f *parser.Field)
 func (pt *ProjectTree) AddFile(file string, config *parser.Configuration) {
 	pt.RemoveFile(file)
 
+	// Collect global pragmas
+	for _, p := range config.Pragmas {
+		txt := strings.TrimSpace(strings.TrimPrefix(p.Text, "//!"))
+		normalized := strings.ReplaceAll(txt, " ", "")
+		if strings.HasPrefix(normalized, "allow(") || strings.HasPrefix(normalized, "ignore(") {
+			pt.GlobalPragmas[file] = append(pt.GlobalPragmas[file], txt)
+		}
+	}
+
 	if config.Package == nil {
 		node := &ProjectNode{
 			Children: make(map[string]*ProjectNode),
@@ -200,12 +215,14 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
 
 	for _, def := range config.Definitions {
 		doc := pt.findDoc(config.Comments, def.Pos())
+		pragmas := pt.findPragmas(config.Pragmas, def.Pos())
 
 		switch d := def.(type) {
 		case *parser.Field:
 			fileFragment.Definitions = append(fileFragment.Definitions, d)
 			pt.indexValue(file, d.Value)
 		case *parser.ObjectNode:
+			fileFragment.Definitions = append(fileFragment.Definitions, d)
 			norm := NormalizeName(d.Name)
 			if _, ok := node.Children[norm]; !ok {
 				node.Children[norm] = &ProjectNode{
@@ -228,7 +245,11 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
 				child.Doc += doc
 			}
 
-			pt.addObjectFragment(child, file, d, doc, config.Comments)
+			if len(pragmas) > 0 {
+				child.Pragmas = append(child.Pragmas, pragmas...)
+			}
+
+			pt.addObjectFragment(child, file, d, doc, config.Comments, config.Pragmas)
 		}
 	}
@@ -237,16 +258,18 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
 	}
 }
 
-func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *parser.ObjectNode, doc string, comments []parser.Comment) {
+func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *parser.ObjectNode, doc string, comments []parser.Comment, pragmas []parser.Pragma) {
 	frag := &Fragment{
 		File:      file,
 		IsObject:  true,
 		ObjectPos: obj.Position,
 		EndPos:    obj.Subnode.EndPosition,
 		Doc:       doc,
 	}
 
 	for _, def := range obj.Subnode.Definitions {
 		subDoc := pt.findDoc(comments, def.Pos())
+		subPragmas := pt.findPragmas(pragmas, def.Pos())
 
 		switch d := def.(type) {
 		case *parser.Field:
@@ -254,6 +277,7 @@ func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *pa
 			pt.indexValue(file, d.Value)
 			pt.extractFieldMetadata(node, d)
 		case *parser.ObjectNode:
+			frag.Definitions = append(frag.Definitions, d)
 			norm := NormalizeName(d.Name)
 			if _, ok := node.Children[norm]; !ok {
 				node.Children[norm] = &ProjectNode{
@@ -276,7 +300,11 @@ func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *pa
 				child.Doc += subDoc
 			}
 
-			pt.addObjectFragment(child, file, d, subDoc, comments)
+			if len(subPragmas) > 0 {
+				child.Pragmas = append(child.Pragmas, subPragmas...)
+			}
+
+			pt.addObjectFragment(child, file, d, subDoc, comments, pragmas)
 		}
 	}
@@ -321,6 +349,30 @@ func (pt *ProjectTree) findDoc(comments []parser.Comment, pos parser.Position) s
 	return docBuilder.String()
 }
 
+func (pt *ProjectTree) findPragmas(pragmas []parser.Pragma, pos parser.Position) []string {
+	var found []string
+	targetLine := pos.Line - 1
+
+	for i := len(pragmas) - 1; i >= 0; i-- {
+		p := pragmas[i]
+		if p.Position.Line > pos.Line {
+			continue
+		}
+		if p.Position.Line == pos.Line {
+			continue
+		}
+
+		if p.Position.Line == targetLine {
+			txt := strings.TrimSpace(strings.TrimPrefix(p.Text, "//!"))
+			found = append(found, txt)
+			targetLine--
+		} else if p.Position.Line < targetLine {
+			break
+		}
+	}
+	return found
+}
+
 func (pt *ProjectTree) indexValue(file string, val parser.Value) {
 	switch v := val.(type) {
 	case *parser.ReferenceValue:
```
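The lookup rule implemented by `findPragmas` above (a pragma applies to the definition on the line directly below it, and consecutive pragma lines stack) can be sketched independently. The types here are simplified stand-ins, not the real parser API:

```go
package main

import (
	"fmt"
	"strings"
)

// pragma is a simplified stand-in for the parser's pragma comment type.
type pragma struct {
	line int
	text string // full comment text, e.g. "//!unused: demo"
}

// pragmasFor collects the pragmas stacked directly above defLine:
// line defLine-1, then defLine-2, and so on while lines stay contiguous.
func pragmasFor(ps []pragma, defLine int) []string {
	byLine := make(map[int]string, len(ps))
	for _, p := range ps {
		byLine[p.line] = strings.TrimSpace(strings.TrimPrefix(p.text, "//!"))
	}
	var found []string
	for l := defLine - 1; ; l-- {
		txt, ok := byLine[l]
		if !ok {
			break // gap: pragmas further up belong to something else
		}
		found = append(found, txt)
	}
	return found
}

func main() {
	ps := []pragma{
		{3, "//!unused: demo"},
		{4, "//!implicit: demo"},
		{9, "//!cast(uint32, int32)"},
	}
	// Lines 4 and 3 attach to a definition on line 5; line 9 does not.
	fmt.Println(pragmasFor(ps, 5))
}
```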
```diff
@@ -340,25 +392,65 @@ func (pt *ProjectTree) ResolveReferences() {
 	for i := range pt.References {
 		ref := &pt.References[i]
 		if isoNode, ok := pt.IsolatedFiles[ref.File]; ok {
-			ref.Target = pt.findNode(isoNode, ref.Name)
+			ref.Target = pt.FindNode(isoNode, ref.Name, nil)
 		} else {
-			ref.Target = pt.findNode(pt.Root, ref.Name)
+			ref.Target = pt.FindNode(pt.Root, ref.Name, nil)
 		}
 	}
 }
 
-func (pt *ProjectTree) findNode(root *ProjectNode, name string) *ProjectNode {
+func (pt *ProjectTree) FindNode(root *ProjectNode, name string, predicate func(*ProjectNode) bool) *ProjectNode {
+	if strings.Contains(name, ".") {
+		parts := strings.Split(name, ".")
+		rootName := parts[0]
+
+		var candidates []*ProjectNode
+		pt.findAllNodes(root, rootName, &candidates)
+
+		for _, cand := range candidates {
+			curr := cand
+			valid := true
+			for i := 1; i < len(parts); i++ {
+				nextName := parts[i]
+				normNext := NormalizeName(nextName)
+				if child, ok := curr.Children[normNext]; ok {
+					curr = child
+				} else {
+					valid = false
+					break
+				}
+			}
+			if valid {
+				if predicate == nil || predicate(curr) {
+					return curr
+				}
+			}
+		}
+		return nil
+	}
+
 	if root.RealName == name || root.Name == name {
-		return root
+		if predicate == nil || predicate(root) {
+			return root
+		}
 	}
 	for _, child := range root.Children {
-		if res := pt.findNode(child, name); res != nil {
+		if res := pt.FindNode(child, name, predicate); res != nil {
 			return res
 		}
 	}
 	return nil
 }
 
+func (pt *ProjectTree) findAllNodes(root *ProjectNode, name string, results *[]*ProjectNode) {
+	if root.RealName == name || root.Name == name {
+		*results = append(*results, root)
+	}
+	for _, child := range root.Children {
+		pt.findAllNodes(child, name, results)
+	}
+}
+
 type QueryResult struct {
 	Node  *ProjectNode
 	Field *parser.Field
@@ -384,6 +476,22 @@ func (pt *ProjectTree) Query(file string, line, col int) *QueryResult {
 	return pt.queryNode(pt.Root, file, line, col)
 }
 
+func (pt *ProjectTree) Walk(visitor func(*ProjectNode)) {
+	if pt.Root != nil {
+		pt.walkRecursive(pt.Root, visitor)
+	}
+	for _, node := range pt.IsolatedFiles {
+		pt.walkRecursive(node, visitor)
+	}
+}
+
+func (pt *ProjectTree) walkRecursive(node *ProjectNode, visitor func(*ProjectNode)) {
+	visitor(node)
+	for _, child := range node.Children {
+		pt.walkRecursive(child, visitor)
+	}
+}
+
 func (pt *ProjectTree) queryNode(node *ProjectNode, file string, line, col int) *QueryResult {
 	for _, frag := range node.Fragments {
 		if frag.File == file {
@@ -410,3 +518,44 @@ func (pt *ProjectTree) queryNode(node *ProjectNode, file string, line, col int)
 	}
 	return nil
 }
+
+func (pt *ProjectTree) GetNodeContaining(file string, pos parser.Position) *ProjectNode {
+	if isoNode, ok := pt.IsolatedFiles[file]; ok {
+		if found := pt.findNodeContaining(isoNode, file, pos); found != nil {
+			return found
+		}
+		return isoNode
+	}
+	if pt.Root != nil {
+		if found := pt.findNodeContaining(pt.Root, file, pos); found != nil {
+			return found
+		}
+		for _, frag := range pt.Root.Fragments {
+			if frag.File == file && !frag.IsObject {
+				return pt.Root
+			}
+		}
+	}
+	return nil
+}
+
+func (pt *ProjectTree) findNodeContaining(node *ProjectNode, file string, pos parser.Position) *ProjectNode {
+	for _, child := range node.Children {
+		if res := pt.findNodeContaining(child, file, pos); res != nil {
+			return res
+		}
+	}
+
+	for _, frag := range node.Fragments {
+		if frag.File == file && frag.IsObject {
+			start := frag.ObjectPos
+			end := frag.EndPos
+
+			if (pos.Line > start.Line || (pos.Line == start.Line && pos.Column >= start.Column)) &&
+				(pos.Line < end.Line || (pos.Line == end.Line && pos.Column <= end.Column)) {
+				return node
+			}
+		}
+	}
+	return nil
+}
```
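`FindNode`'s dotted-name handling above (collect every node matching the first path segment, then walk the child maps for the remaining segments, applying an optional predicate) can be sketched with simplified types. The node structure and names here are hypothetical, not the real `index` package:

```go
package main

import (
	"fmt"
	"strings"
)

type node struct {
	name     string
	children map[string]*node
}

// find resolves a dotted path like "App.GAM1" from root: locate any node
// named after the first segment, then descend through children for the
// remaining segments; pred (may be nil) filters acceptable matches.
func find(root *node, path string, pred func(*node) bool) *node {
	parts := strings.Split(path, ".")
	var candidates []*node
	collect(root, parts[0], &candidates)
	for _, cand := range candidates {
		curr, ok := cand, true
		for _, p := range parts[1:] {
			if curr, ok = curr.children[p]; !ok {
				break // this candidate doesn't have the full path
			}
		}
		if ok && (pred == nil || pred(curr)) {
			return curr
		}
	}
	return nil
}

// collect gathers every node in the tree with the given name.
func collect(n *node, name string, out *[]*node) {
	if n.name == name {
		*out = append(*out, n)
	}
	for _, c := range n.children {
		collect(c, name, out)
	}
}

func main() {
	gam := &node{name: "GAM1", children: map[string]*node{}}
	app := &node{name: "App", children: map[string]*node{"GAM1": gam}}
	root := &node{name: "", children: map[string]*node{"App": app}}
	fmt.Println(find(root, "App.GAM1", nil).name) // GAM1
}
```

Collecting all first-segment candidates before descending is what lets a dotted reference resolve even when several nodes share the same name at different depths.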
|
||||
@@ -7,15 +7,51 @@ import (
	"fmt"
	"io"
	"os"
	"regexp"
	"strings"

	"github.com/marte-dev/marte-dev-tools/internal/formatter"
	"github.com/marte-dev/marte-dev-tools/internal/index"
	"github.com/marte-dev/marte-dev-tools/internal/logger"
	"github.com/marte-dev/marte-dev-tools/internal/parser"
	"github.com/marte-dev/marte-dev-tools/internal/validator"
	"github.com/marte-community/marte-dev-tools/internal/formatter"
	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/logger"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/schema"
	"github.com/marte-community/marte-dev-tools/internal/validator"

	"cuelang.org/go/cue"
)

type CompletionParams struct {
	TextDocument TextDocumentIdentifier `json:"textDocument"`
	Position     Position               `json:"position"`
	Context      CompletionContext      `json:"context,omitempty"`
}

type CompletionContext struct {
	TriggerKind int `json:"triggerKind"`
}

type CompletionItem struct {
	Label            string `json:"label"`
	Kind             int    `json:"kind"`
	Detail           string `json:"detail,omitempty"`
	Documentation    string `json:"documentation,omitempty"`
	InsertText       string `json:"insertText,omitempty"`
	InsertTextFormat int    `json:"insertTextFormat,omitempty"` // 1: PlainText, 2: Snippet
	SortText         string `json:"sortText,omitempty"`
}

type CompletionList struct {
	IsIncomplete bool             `json:"isIncomplete"`
	Items        []CompletionItem `json:"items"`
}

var Tree = index.NewProjectTree()
var Documents = make(map[string]string)
var ProjectRoot string
var GlobalSchema *schema.Schema

type JsonRpcMessage struct {
	Jsonrpc string `json:"jsonrpc"`
	Method  string `json:"method,omitempty"`
@@ -135,9 +171,6 @@ type TextEdit struct {
	NewText string `json:"newText"`
}

var tree = index.NewProjectTree()
var documents = make(map[string]string)
var projectRoot string

func RunServer() {
	reader := bufio.NewReader(os.Stdin)
@@ -151,7 +184,7 @@ func RunServer() {
			continue
		}

		handleMessage(msg)
		HandleMessage(msg)
	}
}

@@ -181,7 +214,7 @@ func readMessage(reader *bufio.Reader) (*JsonRpcMessage, error) {
	return &msg, err
}

func handleMessage(msg *JsonRpcMessage) {
func HandleMessage(msg *JsonRpcMessage) {
	switch msg.Method {
	case "initialize":
		var params InitializeParams
@@ -192,12 +225,15 @@ func handleMessage(msg *JsonRpcMessage) {
		} else if params.RootPath != "" {
			root = params.RootPath
		}

		if root != "" {
			projectRoot = root
			ProjectRoot = root
			logger.Printf("Scanning workspace: %s\n", root)
			tree.ScanDirectory(root)
			tree.ResolveReferences()
			if err := Tree.ScanDirectory(root); err != nil {
				logger.Printf("ScanDirectory failed: %v\n", err)
			}
			Tree.ResolveReferences()
			GlobalSchema = schema.LoadFullSchema(ProjectRoot)
		}
	}
@@ -208,6 +244,9 @@ func handleMessage(msg *JsonRpcMessage) {
			"definitionProvider":         true,
			"referencesProvider":         true,
			"documentFormattingProvider": true,
			"completionProvider": map[string]any{
				"triggerCharacters": []string{"=", " "},
			},
		},
	})
	case "initialized":
@@ -219,18 +258,18 @@ func handleMessage(msg *JsonRpcMessage) {
	case "textDocument/didOpen":
		var params DidOpenTextDocumentParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			handleDidOpen(params)
			HandleDidOpen(params)
		}
	case "textDocument/didChange":
		var params DidChangeTextDocumentParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			handleDidChange(params)
			HandleDidChange(params)
		}
	case "textDocument/hover":
		var params HoverParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			logger.Printf("Hover: %s:%d", params.TextDocument.URI, params.Position.Line)
			res := handleHover(params)
			res := HandleHover(params)
			if res != nil {
				logger.Printf("Res: %v", res.Contents)
			} else {
@@ -244,17 +283,22 @@ func handleMessage(msg *JsonRpcMessage) {
	case "textDocument/definition":
		var params DefinitionParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			respond(msg.ID, handleDefinition(params))
			respond(msg.ID, HandleDefinition(params))
		}
	case "textDocument/references":
		var params ReferenceParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			respond(msg.ID, handleReferences(params))
			respond(msg.ID, HandleReferences(params))
		}
	case "textDocument/completion":
		var params CompletionParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			respond(msg.ID, HandleCompletion(params))
		}
	case "textDocument/formatting":
		var params DocumentFormattingParams
		if err := json.Unmarshal(msg.Params, &params); err == nil {
			respond(msg.ID, handleFormatting(params))
			respond(msg.ID, HandleFormatting(params))
		}
	}
}
@@ -263,37 +307,51 @@ func uriToPath(uri string) string {
	return strings.TrimPrefix(uri, "file://")
}

func handleDidOpen(params DidOpenTextDocumentParams) {
func HandleDidOpen(params DidOpenTextDocumentParams) {
	path := uriToPath(params.TextDocument.URI)
	documents[params.TextDocument.URI] = params.TextDocument.Text
	Documents[params.TextDocument.URI] = params.TextDocument.Text
	p := parser.NewParser(params.TextDocument.Text)
	config, err := p.Parse()
	if err == nil {
		tree.AddFile(path, config)
		tree.ResolveReferences()

	if err != nil {
		publishParserError(params.TextDocument.URI, err)
	} else {
		publishParserError(params.TextDocument.URI, nil)
	}

	if config != nil {
		Tree.AddFile(path, config)
		Tree.ResolveReferences()
		runValidation(params.TextDocument.URI)
	}
}

func handleDidChange(params DidChangeTextDocumentParams) {
func HandleDidChange(params DidChangeTextDocumentParams) {
	if len(params.ContentChanges) == 0 {
		return
	}
	text := params.ContentChanges[0].Text
	documents[params.TextDocument.URI] = text
	Documents[params.TextDocument.URI] = text
	path := uriToPath(params.TextDocument.URI)
	p := parser.NewParser(text)
	config, err := p.Parse()
	if err == nil {
		tree.AddFile(path, config)
		tree.ResolveReferences()

	if err != nil {
		publishParserError(params.TextDocument.URI, err)
	} else {
		publishParserError(params.TextDocument.URI, nil)
	}

	if config != nil {
		Tree.AddFile(path, config)
		Tree.ResolveReferences()
		runValidation(params.TextDocument.URI)
	}
}

func handleFormatting(params DocumentFormattingParams) []TextEdit {
func HandleFormatting(params DocumentFormattingParams) []TextEdit {
	uri := params.TextDocument.URI
	text, ok := documents[uri]
	text, ok := Documents[uri]
	if !ok {
		return nil
	}
@@ -325,17 +383,17 @@ func handleFormatting(params DocumentFormattingParams) []TextEdit {
}

func runValidation(uri string) {
	v := validator.NewValidator(tree, projectRoot)
	v := validator.NewValidator(Tree, ProjectRoot)
	v.ValidateProject()
	v.CheckUnused()

	// Group diagnostics by file
	fileDiags := make(map[string][]LSPDiagnostic)

	// Collect all known files to ensure we clear diagnostics for fixed files
	knownFiles := make(map[string]bool)
	collectFiles(tree.Root, knownFiles)
	collectFiles(Tree.Root, knownFiles)

	// Initialize all known files with empty diagnostics
	for f := range knownFiles {
		fileDiags[f] = []LSPDiagnostic{}
@@ -356,7 +414,7 @@ func runValidation(uri string) {
				Message:  d.Message,
				Source:   "mdt",
			}

			path := d.File
			if path != "" {
				fileDiags[path] = append(fileDiags[path], diag)
@@ -369,7 +427,7 @@ func runValidation(uri string) {
		notification := JsonRpcMessage{
			Jsonrpc: "2.0",
			Method:  "textDocument/publishDiagnostics",
			Params: mustMarshal(PublishDiagnosticsParams{
			Params: mustMarshal(PublishDiagnosticsParams{
				URI:         fileURI,
				Diagnostics: diags,
			}),
@@ -378,6 +436,57 @@ func runValidation(uri string) {
	}
}

func publishParserError(uri string, err error) {
	if err == nil {
		notification := JsonRpcMessage{
			Jsonrpc: "2.0",
			Method:  "textDocument/publishDiagnostics",
			Params: mustMarshal(PublishDiagnosticsParams{
				URI:         uri,
				Diagnostics: []LSPDiagnostic{},
			}),
		}
		send(notification)
		return
	}

	var line, col int
	var msg string
	// Try parsing "line:col: message"
	n, _ := fmt.Sscanf(err.Error(), "%d:%d: ", &line, &col)
	if n == 2 {
		parts := strings.SplitN(err.Error(), ": ", 2)
		if len(parts) == 2 {
			msg = parts[1]
		}
	} else {
		// Fallback
		line = 1
		col = 1
		msg = err.Error()
	}

	diag := LSPDiagnostic{
		Range: Range{
			Start: Position{Line: line - 1, Character: col - 1},
			End:   Position{Line: line - 1, Character: col},
		},
		Severity: 1, // Error
		Message:  msg,
		Source:   "mdt-parser",
	}

	notification := JsonRpcMessage{
		Jsonrpc: "2.0",
		Method:  "textDocument/publishDiagnostics",
		Params: mustMarshal(PublishDiagnosticsParams{
			URI:         uri,
			Diagnostics: []LSPDiagnostic{diag},
		}),
	}
	send(notification)
}

func collectFiles(node *index.ProjectNode, files map[string]bool) {
	for _, frag := range node.Fragments {
		files[frag.File] = true
@@ -392,12 +501,12 @@ func mustMarshal(v any) json.RawMessage {
	return b
}

func handleHover(params HoverParams) *Hover {
func HandleHover(params HoverParams) *Hover {
	path := uriToPath(params.TextDocument.URI)
	line := params.Position.Line + 1
	col := params.Position.Character + 1

	res := tree.Query(path, line, col)
	res := Tree.Query(path, line, col)
	if res == nil {
		logger.Printf("No object/node/reference found")
		return nil
@@ -406,7 +515,11 @@ func handleHover(params HoverParams) *Hover {
	var content string

	if res.Node != nil {
		content = formatNodeInfo(res.Node)
		if res.Node.Target != nil {
			content = fmt.Sprintf("**Link**: `%s` -> `%s`\n\n%s", res.Node.RealName, res.Node.Target.RealName, formatNodeInfo(res.Node.Target))
		} else {
			content = formatNodeInfo(res.Node)
		}
	} else if res.Field != nil {
		content = fmt.Sprintf("**Field**: `%s`", res.Field.Name)
	} else if res.Reference != nil {
@@ -440,12 +553,314 @@ func handleHover(params HoverParams) *Hover {
	}
}

func handleDefinition(params DefinitionParams) any {
func HandleCompletion(params CompletionParams) *CompletionList {
	uri := params.TextDocument.URI
	path := uriToPath(uri)
	text, ok := Documents[uri]
	if !ok {
		return nil
	}

	lines := strings.Split(text, "\n")
	if params.Position.Line >= len(lines) {
		return nil
	}
	lineStr := lines[params.Position.Line]

	col := params.Position.Character
	if col > len(lineStr) {
		col = len(lineStr)
	}

	prefix := lineStr[:col]

	// Case 1: Assigning a value (ends with "=" or "= ")
	if strings.Contains(prefix, "=") {
		lastIdx := strings.LastIndex(prefix, "=")
		beforeEqual := prefix[:lastIdx]

		// Find the last identifier before '='
		key := ""
		re := regexp.MustCompile(`[a-zA-Z][a-zA-Z0-9_\-]*`)
		matches := re.FindAllString(beforeEqual, -1)
		if len(matches) > 0 {
			key = matches[len(matches)-1]
		}

		if key == "Class" {
			return suggestClasses()
		}

		container := Tree.GetNodeContaining(path, parser.Position{Line: params.Position.Line + 1, Column: col + 1})
		if container != nil {
			return suggestFieldValues(container, key, path)
		}
		return nil
	}

	// Case 2: Typing a key inside an object
	container := Tree.GetNodeContaining(path, parser.Position{Line: params.Position.Line + 1, Column: col + 1})
	if container != nil {
		return suggestFields(container)
	}

	return nil
}

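The completion handler's key extraction (take the text left of the caret, cut at the last `=`, keep the final identifier) is the part most worth exercising in isolation. A minimal sketch of the same heuristic, using the same regular expression; `lastKeyBeforeEqual` is an illustrative helper name, not one from the codebase:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// lastKeyBeforeEqual mirrors HandleCompletion's heuristic: cut the prefix at
// the last '=' and return the final identifier before it ("" if none).
func lastKeyBeforeEqual(prefix string) string {
	lastIdx := strings.LastIndex(prefix, "=")
	if lastIdx < 0 {
		return ""
	}
	beforeEqual := prefix[:lastIdx]
	re := regexp.MustCompile(`[a-zA-Z][a-zA-Z0-9_\-]*`)
	matches := re.FindAllString(beforeEqual, -1)
	if len(matches) == 0 {
		return ""
	}
	return matches[len(matches)-1]
}

func main() {
	fmt.Println(lastKeyBeforeEqual("    Class = "))        // → "Class"
	fmt.Println(lastKeyBeforeEqual("Trigger = 1 Type = ")) // → "Type"
}
```

Note that taking the last regex match makes multiple assignments on one line resolve to the key nearest the caret, which is what the value-completion case needs.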
func suggestClasses() *CompletionList {
	if GlobalSchema == nil {
		return nil
	}

	classesVal := GlobalSchema.Value.LookupPath(cue.ParsePath("#Classes"))
	if classesVal.Err() != nil {
		return nil
	}

	iter, err := classesVal.Fields()
	if err != nil {
		return nil
	}

	var items []CompletionItem
	for iter.Next() {
		label := iter.Selector().String()
		label = strings.Trim(label, "?!#")

		items = append(items, CompletionItem{
			Label:  label,
			Kind:   7, // Class
			Detail: "MARTe Class",
		})
	}
	return &CompletionList{Items: items}
}

func suggestFields(container *index.ProjectNode) *CompletionList {
	cls := container.Metadata["Class"]
	if cls == "" {
		return &CompletionList{Items: []CompletionItem{{
			Label:      "Class",
			Kind:       10, // Property
			InsertText: "Class = ",
			Detail:     "Define object class",
		}}}
	}

	if GlobalSchema == nil {
		return nil
	}
	classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s", cls))
	classVal := GlobalSchema.Value.LookupPath(classPath)
	if classVal.Err() != nil {
		return nil
	}

	iter, err := classVal.Fields()
	if err != nil {
		return nil
	}

	existing := make(map[string]bool)
	for _, frag := range container.Fragments {
		for _, def := range frag.Definitions {
			if f, ok := def.(*parser.Field); ok {
				existing[f.Name] = true
			}
		}
	}
	for name := range container.Children {
		existing[name] = true
	}

	var items []CompletionItem
	for iter.Next() {
		label := iter.Selector().String()
		label = strings.Trim(label, "?!#")

		// Skip if already present
		if existing[label] {
			continue
		}

		isOptional := iter.IsOptional()
		kind := 10 // Property
		detail := "Mandatory"
		if isOptional {
			detail = "Optional"
		}

		insertText := label + " = "
		val := iter.Value()
		if val.Kind() == cue.StructKind {
			// Suggest as node
			insertText = "+" + label + " = {\n\t$0\n}"
			kind = 9 // Module
		}

		items = append(items, CompletionItem{
			Label:            label,
			Kind:             kind,
			Detail:           detail,
			InsertText:       insertText,
			InsertTextFormat: 2, // Snippet
		})
	}
	return &CompletionList{Items: items}
}

func suggestFieldValues(container *index.ProjectNode, field string, path string) *CompletionList {
	var root *index.ProjectNode
	if iso, ok := Tree.IsolatedFiles[path]; ok {
		root = iso
	} else {
		root = Tree.Root
	}

	if field == "DataSource" {
		return suggestObjects(root, "DataSource")
	}
	if field == "Functions" {
		return suggestObjects(root, "GAM")
	}
	if field == "Type" {
		return suggestSignalTypes()
	}

	if list := suggestCUEEnums(container, field); list != nil {
		return list
	}

	return nil
}

func suggestSignalTypes() *CompletionList {
	types := []string{
		"uint8", "int8", "uint16", "int16", "uint32", "int32", "uint64", "int64",
		"float32", "float64", "string", "bool", "char8",
	}
	var items []CompletionItem
	for _, t := range types {
		items = append(items, CompletionItem{
			Label:  t,
			Kind:   13, // EnumMember
			Detail: "Signal Type",
		})
	}
	return &CompletionList{Items: items}
}

func suggestCUEEnums(container *index.ProjectNode, field string) *CompletionList {
	if GlobalSchema == nil {
		return nil
	}
	cls := container.Metadata["Class"]
	if cls == "" {
		return nil
	}

	classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s.%s", cls, field))
	val := GlobalSchema.Value.LookupPath(classPath)
	if val.Err() != nil {
		return nil
	}

	op, args := val.Expr()
	var values []cue.Value
	if op == cue.OrOp {
		values = args
	} else {
		values = []cue.Value{val}
	}

	var items []CompletionItem
	for _, v := range values {
		if !v.IsConcrete() {
			continue
		}

		str, err := v.String() // Returns quoted string for string values?
		if err != nil {
			continue
		}

		// Ensure strings are quoted
		if v.Kind() == cue.StringKind && !strings.HasPrefix(str, "\"") {
			str = fmt.Sprintf("\"%s\"", str)
		}

		items = append(items, CompletionItem{
			Label:  str,
			Kind:   13, // EnumMember
			Detail: "Enum Value",
		})
	}

	if len(items) > 0 {
		return &CompletionList{Items: items}
	}
	return nil
}

func suggestObjects(root *index.ProjectNode, filter string) *CompletionList {
	if root == nil {
		return nil
	}
	var items []CompletionItem

	var walk func(*index.ProjectNode)
	walk = func(node *index.ProjectNode) {
		match := false
		if filter == "GAM" {
			if isGAM(node) {
				match = true
			}
		} else if filter == "DataSource" {
			if isDataSource(node) {
				match = true
			}
		}

		if match {
			items = append(items, CompletionItem{
				Label:  node.Name,
				Kind:   6, // Variable
				Detail: node.Metadata["Class"],
			})
		}

		for _, child := range node.Children {
			walk(child)
		}
	}

	walk(root)
	return &CompletionList{Items: items}
}

func isGAM(node *index.ProjectNode) bool {
	if node.RealName == "" || (node.RealName[0] != '+' && node.RealName[0] != '$') {
		return false
	}
	_, hasInput := node.Children["InputSignals"]
	_, hasOutput := node.Children["OutputSignals"]
	return hasInput || hasOutput
}

func isDataSource(node *index.ProjectNode) bool {
	if node.Parent != nil && node.Parent.Name == "Data" {
		return true
	}
	_, hasSignals := node.Children["Signals"]
	return hasSignals
}

func HandleDefinition(params DefinitionParams) any {
	path := uriToPath(params.TextDocument.URI)
	line := params.Position.Line + 1
	col := params.Position.Character + 1

	res := tree.Query(path, line, col)
	res := Tree.Query(path, line, col)
	if res == nil {
		return nil
	}
@@ -454,7 +869,11 @@ func handleDefinition(params DefinitionParams) any {
	if res.Reference != nil && res.Reference.Target != nil {
		targetNode = res.Reference.Target
	} else if res.Node != nil {
		targetNode = res.Node
		if res.Node.Target != nil {
			targetNode = res.Node.Target
		} else {
			targetNode = res.Node
		}
	}

	if targetNode != nil {
@@ -476,12 +895,12 @@ func handleDefinition(params DefinitionParams) any {
	return nil
}

func handleReferences(params ReferenceParams) []Location {
func HandleReferences(params ReferenceParams) []Location {
	path := uriToPath(params.TextDocument.URI)
	line := params.Position.Line + 1
	col := params.Position.Character + 1

	res := tree.Query(path, line, col)
	res := Tree.Query(path, line, col)
	if res == nil {
		return nil
	}
@@ -497,23 +916,30 @@ func handleReferences(params ReferenceParams) []Location {
		return nil
	}

	// Resolve canonical target (follow link if present)
	canonical := targetNode
	if targetNode.Target != nil {
		canonical = targetNode.Target
	}

	var locations []Location
	if params.Context.IncludeDeclaration {
		for _, frag := range targetNode.Fragments {
		for _, frag := range canonical.Fragments {
			if frag.IsObject {
				locations = append(locations, Location{
					URI: "file://" + frag.File,
					Range: Range{
						Start: Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1},
						End:   Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1 + len(targetNode.RealName)},
						End:   Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1 + len(canonical.RealName)},
					},
				})
			}
		}
	}

	for _, ref := range tree.References {
		if ref.Target == targetNode {
	// 1. References from index (Aliases)
	for _, ref := range Tree.References {
		if ref.Target == canonical {
			locations = append(locations, Location{
				URI: "file://" + ref.File,
				Range: Range{
@@ -524,17 +950,33 @@ func handleReferences(params ReferenceParams) []Location {
		}
	}

	// 2. References from Node Targets (Direct References)
	Tree.Walk(func(node *index.ProjectNode) {
		if node.Target == canonical {
			for _, frag := range node.Fragments {
				if frag.IsObject {
					locations = append(locations, Location{
						URI: "file://" + frag.File,
						Range: Range{
							Start: Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1},
							End:   Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1 + len(node.RealName)},
						},
					})
				}
			}
		}
	})

	return locations
}

func formatNodeInfo(node *index.ProjectNode) string {
	class := node.Metadata["Class"]
	if class == "" {
		class = "Unknown"
	info := ""
	if class := node.Metadata["Class"]; class != "" {
		info = fmt.Sprintf("`%s:%s`\n\n", class, node.RealName[1:])
	} else {
		info = fmt.Sprintf("`%s`\n\n", node.RealName)
	}

	info := fmt.Sprintf("**Object**: `%s`\n\n**Class**: `%s`", node.RealName, class)

	// Check if it's a Signal (has Type or DataSource)
	typ := node.Metadata["Type"]
	ds := node.Metadata["DataSource"]
@@ -549,8 +991,8 @@ func formatNodeInfo(node *index.ProjectNode) string {
	}

	// Size
	dims := node.Metadata["NumberOfDimensions"]
	elems := node.Metadata["NumberOfElements"]
	dims := node.Metadata["NumberOfDimensions"]
	elems := node.Metadata["NumberOfElements"]
	if dims != "" || elems != "" {
		sigInfo += fmt.Sprintf("**Size**: `[%s]`, `%s` dims ", elems, dims)
	}
@@ -560,6 +1002,57 @@ elems := node.Metadata["NumberOfElements"]
	if node.Doc != "" {
		info += fmt.Sprintf("\n\n%s", node.Doc)
	}

	// Find references
	var refs []string
	for _, ref := range Tree.References {
		if ref.Target == node {
			container := Tree.GetNodeContaining(ref.File, ref.Position)
			if container != nil {
				threadName := ""
				stateName := ""

				curr := container
				for curr != nil {
					if cls, ok := curr.Metadata["Class"]; ok {
						if cls == "RealTimeThread" {
							threadName = curr.RealName
						}
						if cls == "RealTimeState" {
							stateName = curr.RealName
						}
					}
					curr = curr.Parent
				}

				if threadName != "" || stateName != "" {
					refStr := ""
					if stateName != "" {
						refStr += fmt.Sprintf("State: `%s`", stateName)
					}
					if threadName != "" {
						if refStr != "" {
							refStr += ", "
						}
						refStr += fmt.Sprintf("Thread: `%s`", threadName)
					}
					refs = append(refs, refStr)
				}
			}
		}
	}

	if len(refs) > 0 {
		uniqueRefs := make(map[string]bool)
		info += "\n\n**Referenced in**:\n"
		for _, r := range refs {
			if !uniqueRefs[r] {
				uniqueRefs[r] = true
				info += fmt.Sprintf("- %s\n", r)
			}
		}
	}

	return info
}

@@ -575,4 +1068,4 @@ func respond(id any, result any) {
func send(msg any) {
	body, _ := json.Marshal(msg)
	fmt.Printf("Content-Length: %d\r\n\r\n%s", len(body), body)
}
}

@@ -1,210 +0,0 @@
package lsp

import (
	"encoding/json"
	"os"
	"path/filepath"
	"strings"
	"testing"

	"github.com/marte-dev/marte-dev-tools/internal/index"
	"github.com/marte-dev/marte-dev-tools/internal/parser"
)

func TestInitProjectScan(t *testing.T) {
	// 1. Setup temp dir with files
	tmpDir, err := os.MkdirTemp("", "lsp_test")
	if err != nil {
		t.Fatal(err)
	}
	defer os.RemoveAll(tmpDir)

	// File 1: Definition
	if err := os.WriteFile(filepath.Join(tmpDir, "def.marte"), []byte("#package Test.Common\n+Target = { Class = C }"), 0644); err != nil {
		t.Fatal(err)
	}
	// File 2: Reference. After the "#package Test.Common" header,
	// "+Source = { Class = C Link = Target }" sits on line 1 (0-based),
	// and "Target" starts at 0-based character 29 on that line.
	if err := os.WriteFile(filepath.Join(tmpDir, "ref.marte"), []byte("#package Test.Common\n+Source = { Class = C Link = Target }"), 0644); err != nil {
		t.Fatal(err)
	}

	// 2. Initialize
	tree = index.NewProjectTree() // Reset global tree

	initParams := InitializeParams{RootPath: tmpDir}
	paramsBytes, _ := json.Marshal(initParams)

	msg := &JsonRpcMessage{
		Method: "initialize",
		Params: paramsBytes,
		ID:     1,
	}

	handleMessage(msg)

	// Query the reference in ref.marte at "Target"
	// Target starts at index 29 (0-based) on Line 1
	defParams := DefinitionParams{
		TextDocument: TextDocumentIdentifier{URI: "file://" + filepath.Join(tmpDir, "ref.marte")},
		Position:     Position{Line: 1, Character: 29},
	}

	res := handleDefinition(defParams)
	if res == nil {
		t.Fatal("Definition not found via LSP after initialization")
	}

	locs, ok := res.([]Location)
	if !ok {
		t.Fatalf("Expected []Location, got %T", res)
	}

	if len(locs) == 0 {
		t.Fatal("No locations found")
	}

	// Verify uri points to def.marte
	expectedURI := "file://" + filepath.Join(tmpDir, "def.marte")
	if locs[0].URI != expectedURI {
		t.Errorf("Expected URI %s, got %s", expectedURI, locs[0].URI)
	}
}

func TestHandleDefinition(t *testing.T) {
	// Reset tree for test
	tree = index.NewProjectTree()

	content := `
+MyObject = {
    Class = Type
}
+RefObject = {
    Class = Type
    RefField = MyObject
}
`
	path := "/test.marte"
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	tree.AddFile(path, config)
	tree.ResolveReferences()

	t.Logf("Refs: %d", len(tree.References))
	for _, r := range tree.References {
		t.Logf("  %s at %d:%d", r.Name, r.Position.Line, r.Position.Column)
	}

	// Test Go to Definition on MyObject reference
	params := DefinitionParams{
		TextDocument: TextDocumentIdentifier{URI: "file://" + path},
		Position:     Position{Line: 6, Character: 15}, // "MyObject" in RefField = MyObject
	}

	result := handleDefinition(params)
	if result == nil {
		t.Fatal("handleDefinition returned nil")
	}

	locations, ok := result.([]Location)
	if !ok {
		t.Fatalf("Expected []Location, got %T", result)
	}

	if len(locations) != 1 {
		t.Fatalf("Expected 1 location, got %d", len(locations))
	}

	if locations[0].Range.Start.Line != 1 { // +MyObject is on line 2 (0-indexed 1)
		t.Errorf("Expected definition on line 1, got %d", locations[0].Range.Start.Line)
	}
}

func TestHandleReferences(t *testing.T) {
	// Reset tree for test
	tree = index.NewProjectTree()

	content := `
+MyObject = {
    Class = Type
}
+RefObject = {
    Class = Type
    RefField = MyObject
}
+AnotherRef = {
    Ref = MyObject
}
`
	path := "/test.marte"
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	tree.AddFile(path, config)
	tree.ResolveReferences()

	// Test Find References for MyObject (triggered from its definition)
	params := ReferenceParams{
		TextDocument: TextDocumentIdentifier{URI: "file://" + path},
		Position:     Position{Line: 1, Character: 1}, // "+MyObject"
		Context:      ReferenceContext{IncludeDeclaration: true},
	}

	locations := handleReferences(params)
	if len(locations) != 3 { // 1 declaration + 2 references
		t.Fatalf("Expected 3 locations, got %d", len(locations))
	}
}

func TestLSPFormatting(t *testing.T) {
	// Setup
	content := `
#package Proj.Main
+Object={
Field=1
}
`
	uri := "file:///test.marte"

	// Open (populate documents map)
	documents[uri] = content

	// Format
	params := DocumentFormattingParams{
		TextDocument: TextDocumentIdentifier{URI: uri},
	}

	edits := handleFormatting(params)

	if len(edits) != 1 {
		t.Fatalf("Expected 1 edit, got %d", len(edits))
	}

	newText := edits[0].NewText

	expected := `#package Proj.Main

+Object = {
    Field = 1
}
`
	// Normalize newlines for comparison just in case
	if strings.TrimSpace(strings.ReplaceAll(newText, "\r\n", "\n")) != strings.TrimSpace(strings.ReplaceAll(expected, "\r\n", "\n")) {
		t.Errorf("Formatting mismatch.\nExpected:\n%s\nGot:\n%s", expected, newText)
	}
}

@@ -22,6 +22,7 @@ const (
 	TokenPragma
 	TokenComment
 	TokenDocstring
+	TokenComma
 )
 
 type Token struct {
@@ -121,6 +122,8 @@ func (l *Lexer) NextToken() Token {
 		return l.emit(TokenLBrace)
 	case '}':
 		return l.emit(TokenRBrace)
+	case ',':
+		return l.emit(TokenComma)
 	case '"':
 		return l.lexString()
 	case '/':
@@ -148,7 +151,7 @@ func (l *Lexer) NextToken() Token {
 func (l *Lexer) lexIdentifier() Token {
 	for {
 		r := l.next()
-		if unicode.IsLetter(r) || unicode.IsDigit(r) || r == '_' || r == '-' {
+		if unicode.IsLetter(r) || unicode.IsDigit(r) || r == '_' || r == '-' || r == '.' || r == ':' {
 			continue
 		}
 		l.backup()
@@ -186,7 +189,7 @@ func (l *Lexer) lexString() Token {
 func (l *Lexer) lexNumber() Token {
 	for {
 		r := l.next()
-		if unicode.IsDigit(r) || r == '.' || r == 'x' || r == 'b' || r == 'e' || r == '-' {
+		if unicode.IsDigit(r) || unicode.IsLetter(r) || r == '.' || r == '-' || r == '+' {
			continue
 		}
 		l.backup()
@@ -206,6 +209,20 @@ func (l *Lexer) lexComment() Token {
 		}
 		return l.lexUntilNewline(TokenComment)
 	}
+	if r == '*' {
+		for {
+			r := l.next()
+			if r == -1 {
+				return l.emit(TokenError)
+			}
+			if r == '*' {
+				if l.peek() == '/' {
+					l.next() // consume /
+					return l.emit(TokenComment)
+				}
+			}
+		}
+	}
 	l.backup()
 	return l.emit(TokenError)
 }
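The `/* ... */` branch in `lexComment` loops until it sees `*` followed by `/`, and emits `TokenError` when it hits end-of-input first. A standalone sketch of the same scanning loop (hypothetical helper name, not part of the diff):

```go
package main

import (
	"fmt"
	"strings"
)

// scanBlockComment reads input positioned just after an opening "/*"
// and returns the comment body plus whether the closing "*/" was found.
// Hitting end-of-input first corresponds to the TokenError case above.
func scanBlockComment(input string) (body string, ok bool) {
	var b strings.Builder
	for i := 0; i < len(input); i++ {
		if input[i] == '*' && i+1 < len(input) && input[i+1] == '/' {
			return b.String(), true // closing "*/" found
		}
		b.WriteByte(input[i])
	}
	return b.String(), false // unterminated comment
}

func main() {
	body, ok := scanBlockComment(" a comment */ rest")
	fmt.Printf("%q %v\n", body, ok) // " a comment " true
	_, ok = scanBlockComment(" never closed")
	fmt.Println(ok) // false
}
```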
@@ -240,4 +257,4 @@ func (l *Lexer) lexPackage() Token {
 		return l.lexUntilNewline(TokenPackage)
 	}
 	return l.emit(TokenError)
 }
@@ -11,6 +11,7 @@ type Parser struct {
 	buf      []Token
 	comments []Comment
 	pragmas  []Pragma
+	errors   []error
 }
 
 func NewParser(input string) *Parser {
@@ -19,6 +20,10 @@ func NewParser(input string) *Parser {
 	}
 }
 
+func (p *Parser) addError(pos Position, msg string) {
+	p.errors = append(p.errors, fmt.Errorf("%d:%d: %s", pos.Line, pos.Column, msg))
+}
+
 func (p *Parser) next() Token {
 	if len(p.buf) > 0 {
 		t := p.buf[0]
@@ -71,72 +76,82 @@ func (p *Parser) Parse() (*Configuration, error) {
 			continue
 		}
 
-		def, err := p.parseDefinition()
-		if err != nil {
-			return nil, err
+		def, ok := p.parseDefinition()
+		if ok {
+			config.Definitions = append(config.Definitions, def)
+		} else {
+			// Synchronization: skip token if not consumed to make progress
+			if p.peek() == tok {
+				p.next()
+			}
 		}
-		config.Definitions = append(config.Definitions, def)
 	}
 	config.Comments = p.comments
 	config.Pragmas = p.pragmas
-	return config, nil
+
+	var err error
+	if len(p.errors) > 0 {
+		err = p.errors[0]
+	}
+	return config, err
 }
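The reworked `Parse` loop collects errors instead of aborting, and guarantees progress by skipping the offending token whenever `parseDefinition` consumed nothing. A minimal sketch of that recovery strategy (toy token stream and names are hypothetical, not the project's types):

```go
package main

import "fmt"

// toyParser mimics the recovery pattern: parse what you can, record
// errors, and always consume at least one token so the loop cannot spin.
type toyParser struct {
	toks []string
	pos  int
	errs []error
}

func (p *toyParser) peek() string { return p.toks[p.pos] }
func (p *toyParser) next() string { t := p.toks[p.pos]; p.pos++; return t }

// parseOne accepts tokens of the form "k=v"; anything else is an error.
func (p *toyParser) parseOne() (string, bool) {
	t := p.peek()
	if len(t) >= 3 && t[1] == '=' {
		return p.next(), true
	}
	p.errs = append(p.errs, fmt.Errorf("unexpected token %q", t))
	return "", false
}

func (p *toyParser) parseAll() []string {
	var defs []string
	for p.pos < len(p.toks) {
		start := p.pos
		if d, ok := p.parseOne(); ok {
			defs = append(defs, d)
		} else if p.pos == start {
			p.next() // synchronize: skip the offending token
		}
	}
	return defs
}

func main() {
	p := &toyParser{toks: []string{"a=1", "???", "b=2"}}
	fmt.Println(p.parseAll(), len(p.errs)) // [a=1 b=2] 1
}
```

The `p.pos == start` check is the same progress guarantee as `if p.peek() == tok { p.next() }` in the diff: without it, a failing production that consumes nothing would loop forever.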
 
-func (p *Parser) parseDefinition() (Definition, error) {
+func (p *Parser) parseDefinition() (Definition, bool) {
 	tok := p.next()
 	switch tok.Type {
 	case TokenIdentifier:
 		// Could be Field = Value OR Node = { ... }
 		name := tok.Value
-		if p.next().Type != TokenEqual {
-			return nil, fmt.Errorf("%d:%d: expected =", tok.Position.Line, tok.Position.Column)
+		if p.peek().Type != TokenEqual {
+			p.addError(tok.Position, "expected =")
+			return nil, false
 		}
+		p.next() // Consume =
 
 		// Disambiguate based on RHS
 		nextTok := p.peek()
 		if nextTok.Type == TokenLBrace {
 			// Check if it looks like a Subnode (contains definitions) or Array (contains values)
 			if p.isSubnodeLookahead() {
-				sub, err := p.parseSubnode()
-				if err != nil {
-					return nil, err
+				sub, ok := p.parseSubnode()
+				if !ok {
+					return nil, false
 				}
 				return &ObjectNode{
 					Position: tok.Position,
 					Name:     name,
 					Subnode:  sub,
-				}, nil
+				}, true
 			}
 		}
 
 		// Default to Field
-		val, err := p.parseValue()
-		if err != nil {
-			return nil, err
+		val, ok := p.parseValue()
+		if !ok {
+			return nil, false
 		}
 		return &Field{
 			Position: tok.Position,
 			Name:     name,
 			Value:    val,
-		}, nil
+		}, true
 
 	case TokenObjectIdentifier:
 		// node = subnode
 		name := tok.Value
-		if p.next().Type != TokenEqual {
-			return nil, fmt.Errorf("%d:%d: expected =", tok.Position.Line, tok.Position.Column)
+		if p.peek().Type != TokenEqual {
+			p.addError(tok.Position, "expected =")
+			return nil, false
 		}
-		sub, err := p.parseSubnode()
-		if err != nil {
-			return nil, err
+		p.next() // Consume =
+
+		sub, ok := p.parseSubnode()
+		if !ok {
+			return nil, false
 		}
 		return &ObjectNode{
 			Position: tok.Position,
 			Name:     name,
 			Subnode:  sub,
-		}, nil
+		}, true
 	default:
-		return nil, fmt.Errorf("%d:%d: unexpected token %v", tok.Position.Line, tok.Position.Column, tok.Value)
+		p.addError(tok.Position, fmt.Sprintf("unexpected token %v", tok.Value))
+		return nil, false
 	}
 }
 
@@ -145,17 +160,17 @@ func (p *Parser) isSubnodeLookahead() bool {
 	// Look inside:
 	// peek(0) is '{'
 	// peek(1) is first token inside
 
 	t1 := p.peekN(1)
 	if t1.Type == TokenRBrace {
 		// {} -> Empty. Assume Array (Value) by default, unless forced?
 		// If we return false, it parses as ArrayValue.
 		// If user writes "Sig = {}", is it an empty signal?
 		// Empty array is more common for value.
 		// If "Sig" is a node, it should probably have content or use +Sig.
 		return false
 	}
 
 	if t1.Type == TokenIdentifier {
 		// Identifier inside.
 		// If followed by '=', it's a definition -> Subnode.
@@ -166,20 +181,21 @@ func (p *Parser) isSubnodeLookahead() bool {
 		// Identifier alone or followed by something else -> Reference/Value -> Array
 		return false
 	}
 
 	if t1.Type == TokenObjectIdentifier {
 		// +Node = ... -> Definition -> Subnode
 		return true
 	}
 
 	// Literals -> Array
 	return false
 }
 
-func (p *Parser) parseSubnode() (Subnode, error) {
+func (p *Parser) parseSubnode() (Subnode, bool) {
 	tok := p.next()
 	if tok.Type != TokenLBrace {
-		return Subnode{}, fmt.Errorf("%d:%d: expected {", tok.Position.Line, tok.Position.Column)
+		p.addError(tok.Position, "expected {")
+		return Subnode{}, false
 	}
 	sub := Subnode{Position: tok.Position}
 	for {
@@ -190,43 +206,45 @@ func (p *Parser) parseSubnode() (Subnode, error) {
 			break
 		}
 		if t.Type == TokenEOF {
-			return sub, fmt.Errorf("%d:%d: unexpected EOF, expected }", t.Position.Line, t.Position.Column)
+			p.addError(t.Position, "unexpected EOF, expected }")
+			sub.EndPosition = t.Position
+			return sub, true
 		}
-		def, err := p.parseDefinition()
-		if err != nil {
-			return sub, err
+		def, ok := p.parseDefinition()
+		if ok {
+			sub.Definitions = append(sub.Definitions, def)
+		} else {
+			if p.peek() == t {
+				p.next()
+			}
 		}
-		sub.Definitions = append(sub.Definitions, def)
 	}
-	return sub, nil
+	return sub, true
 }
 
-func (p *Parser) parseValue() (Value, error) {
+func (p *Parser) parseValue() (Value, bool) {
 	tok := p.next()
 	switch tok.Type {
 	case TokenString:
 		return &StringValue{
 			Position: tok.Position,
 			Value:    strings.Trim(tok.Value, "\""),
 			Quoted:   true,
-		}, nil
+		}, true
 
 	case TokenNumber:
 		// Simplistic handling
 		if strings.Contains(tok.Value, ".") || strings.Contains(tok.Value, "e") {
 			f, _ := strconv.ParseFloat(tok.Value, 64)
-			return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, nil
+			return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, true
 		}
 		i, _ := strconv.ParseInt(tok.Value, 0, 64)
-		return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, nil
+		return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, true
 	case TokenBool:
-		return &BoolValue{Position: tok.Position, Value: tok.Value == "true"}, nil
+		return &BoolValue{Position: tok.Position, Value: tok.Value == "true"}, true
 	case TokenIdentifier:
 		// reference?
-		return &ReferenceValue{Position: tok.Position, Value: tok.Value}, nil
+		return &ReferenceValue{Position: tok.Position, Value: tok.Value}, true
 	case TokenLBrace:
 		// array
 		arr := &ArrayValue{Position: tok.Position}
 		for {
 			t := p.peek()
@@ -235,14 +253,19 @@ func (p *Parser) parseValue() (Value, error) {
 				arr.EndPosition = endTok.Position
 				break
 			}
-			val, err := p.parseValue()
-			if err != nil {
-				return nil, err
-			}
+			if t.Type == TokenComma {
+				p.next()
+				continue
+			}
+			val, ok := p.parseValue()
+			if !ok {
+				return nil, false
+			}
 			arr.Elements = append(arr.Elements, val)
 		}
-		return arr, nil
+		return arr, true
 	default:
-		return nil, fmt.Errorf("%d:%d: unexpected value token %v", tok.Position.Line, tok.Position.Column, tok.Value)
+		p.addError(tok.Position, fmt.Sprintf("unexpected value token %v", tok.Value))
+		return nil, false
 	}
 }
297
internal/schema/marte.cue
Normal file
@@ -0,0 +1,297 @@
package schema

#Classes: {
	RealTimeApplication: {
		Functions: {...} // type: node
		Data!: {...} // type: node
		States!: {...} // type: node
		...
	}
	Message: {
		...
	}
	StateMachineEvent: {
		NextState!: string
		NextStateError!: string
		Timeout: uint32
		[_= !~"^(Class|NextState|Timeout|NextStateError|[#_$].+)$"]: Message
		...
	}
	_State: {
		Class: "ReferenceContainer"
		ENTER?: {
			Class: "ReferenceContainer"
			...
		}
		[_ = !~"^(Class|ENTER)$"]: StateMachineEvent
		...
	}
	StateMachine: {
		[_ = !~"^(Class|[$].*)$"]: _State
		...
	}
	RealTimeState: {
		Threads: {...} // type: node
		...
	}
	RealTimeThread: {
		Functions: [...] // type: array
		...
	}
	GAMScheduler: {
		TimingDataSource: string // type: reference
		...
	}
	TimingDataSource: {
		direction: "IN"
		...
	}
	IOGAM: {
		InputSignals?: {...} // type: node
		OutputSignals?: {...} // type: node
		...
	}
	ReferenceContainer: {
		...
	}
	ConstantGAM: {
		...
	}
	PIDGAM: {
		Kp: float | int // type: float (allow int as it promotes)
		Ki: float | int
		Kd: float | int
		...
	}
	FileDataSource: {
		Filename: string
		Format?: string
		direction: "INOUT"
		...
	}
	LoggerDataSource: {
		direction: "OUT"
		...
	}
	DANStream: {
		Timeout?: int
		direction: "OUT"
		...
	}
	EPICSCAInput: {
		direction: "IN"
		...
	}
	EPICSCAOutput: {
		direction: "OUT"
		...
	}
	EPICSPVAInput: {
		direction: "IN"
		...
	}
	EPICSPVAOutput: {
		direction: "OUT"
		...
	}
	SDNSubscriber: {
		Address: string
		Port: int
		Interface?: string
		direction: "IN"
		...
	}
	SDNPublisher: {
		Address: string
		Port: int
		Interface?: string
		direction: "OUT"
		...
	}
	UDPReceiver: {
		Port: int
		Address?: string
		direction: "IN"
		...
	}
	UDPSender: {
		Destination: string
		direction: "OUT"
		...
	}
	FileReader: {
		Filename: string
		Format?: string
		Interpolate?: string
		direction: "IN"
		...
	}
	FileWriter: {
		Filename: string
		Format?: string
		StoreOnTrigger?: int
		direction: "OUT"
		...
	}
	OrderedClass: {
		First: int
		Second: string
		...
	}
	BaseLib2GAM: {...}
	ConversionGAM: {...}
	DoubleHandshakeGAM: {...}
	FilterGAM: {
		Num: [...]
		Den: [...]
		ResetInEachState?: _
		InputSignals?: {...}
		OutputSignals?: {...}
		...
	}
	HistogramGAM: {
		BeginCycleNumber?: int
		StateChangeResetName?: string
		InputSignals?: {...}
		OutputSignals?: {...}
		...
	}
	Interleaved2FlatGAM: {...}
	FlattenedStructIOGAM: {...}
	MathExpressionGAM: {
		Expression: string
		InputSignals?: {...}
		OutputSignals?: {...}
		...
	}
	MessageGAM: {...}
	MuxGAM: {...}
	SimulinkWrapperGAM: {...}
	SSMGAM: {...}
	StatisticsGAM: {...}
	TimeCorrectionGAM: {...}
	TriggeredIOGAM: {...}
	WaveformGAM: {...}
	DAN: {
		direction: "OUT"
		...
	}
	LinuxTimer: {
		ExecutionMode?: string
		SleepNature?: string
		SleepPercentage?: _
		Phase?: int
		CPUMask?: int
		TimeProvider?: {...}
		Signals: {...}
		direction: "IN"
		...
	}
	LinkDataSource: {
		direction: "INOUT"
		...
	}
	MDSReader: {
		TreeName: string
		ShotNumber: int
		Frequency: float | int
		Signals: {...}
		direction: "IN"
		...
	}
	MDSWriter: {
		NumberOfBuffers: int
		CPUMask: int
		StackSize: int
		TreeName: string
		PulseNumber?: int
		StoreOnTrigger: int
		EventName: string
		TimeRefresh: float | int
		NumberOfPreTriggers?: int
		NumberOfPostTriggers?: int
		Signals: {...}
		Messages?: {...}
		direction: "OUT"
		...
	}
	NI1588TimeStamp: {
		direction: "IN"
		...
	}
	NI6259ADC: {
		direction: "IN"
		...
	}
	NI6259DAC: {
		direction: "OUT"
		...
	}
	NI6259DIO: {
		direction: "INOUT"
		...
	}
	NI6368ADC: {
		direction: "IN"
		...
	}
	NI6368DAC: {
		direction: "OUT"
		...
	}
	NI6368DIO: {
		direction: "INOUT"
		...
	}
	NI9157CircularFifoReader: {
		direction: "IN"
		...
	}
	NI9157MxiDataSource: {
		direction: "INOUT"
		...
	}
	OPCUADSInput: {
		direction: "IN"
		...
	}
	OPCUADSOutput: {
		direction: "OUT"
		...
	}
	RealTimeThreadAsyncBridge: {...}
	RealTimeThreadSynchronisation: {...}
	UARTDataSource: {
		direction: "INOUT"
		...
	}
	BaseLib2Wrapper: {...}
	EPICSCAClient: {...}
	EPICSPVA: {...}
	MemoryGate: {...}
	OPCUA: {...}
	SysLogger: {...}
	GAMDataSource: {
		direction: "INOUT"
		...
	}
}

// Definition for any Object.
// It must have a Class field.
// Based on Class, it validates against #Classes.
#Object: {
	Class: string
	// Allow any other field by default (extensibility),
	// unless #Classes definition is closed.
	// We allow open structs now.
	...

	// Unify if Class is known.
	// If Class is NOT in #Classes, this might fail or do nothing depending on CUE logic.
	// Actually, `#Classes[Class]` fails if key is missing.
	// This ensures we validate against known classes.
	// If we want to allow unknown classes, we need a check.
	// But spec implies validation should check known classes.
	#Classes[Class]
}
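The `#Classes[Class]` reference is what makes `#Object` dispatch on the value of `Class`: unification selects the matching class definition, and fails for class names not present in `#Classes`. A reduced illustration (hypothetical instance, not part of the schema file above):

```cue
package schema

#Classes: {
	PIDGAM: {Kp: float | int, Ki: float | int, Kd: float | int, ...}
}

#Object: {
	Class: string
	...
	#Classes[Class] // selects the definition named by Class
}

// Unifies cleanly: "PIDGAM" exists and the fields satisfy its constraints.
// With Class: "NoSuchGAM" the index expression would fail instead.
pid: #Object & {Class: "PIDGAM", Kp: 1.0, Ki: 0.5, Kd: 0}
```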
@@ -1,209 +0,0 @@
{
  "classes": {
    "RealTimeApplication": {
      "fields": [
        {"name": "Functions", "type": "node", "mandatory": true},
        {"name": "Data", "type": "node", "mandatory": true},
        {"name": "States", "type": "node", "mandatory": true}
      ]
    },
    "StateMachine": {
      "fields": [
        {"name": "States", "type": "node", "mandatory": true}
      ]
    },
    "GAMScheduler": {
      "fields": [
        {"name": "TimingDataSource", "type": "reference", "mandatory": true}
      ]
    },
    "TimingDataSource": {
      "fields": []
    },
    "IOGAM": {
      "fields": [
        {"name": "InputSignals", "type": "node", "mandatory": false},
        {"name": "OutputSignals", "type": "node", "mandatory": false}
      ]
    },
    "ReferenceContainer": {
      "fields": []
    },
    "ConstantGAM": {
      "fields": []
    },
    "PIDGAM": {
      "fields": [
        {"name": "Kp", "type": "float", "mandatory": true},
        {"name": "Ki", "type": "float", "mandatory": true},
        {"name": "Kd", "type": "float", "mandatory": true}
      ]
    },
    "FileDataSource": {
      "fields": [
        {"name": "Filename", "type": "string", "mandatory": true},
        {"name": "Format", "type": "string", "mandatory": false}
      ]
    },
    "LoggerDataSource": {
      "fields": []
    },
    "DANStream": {
      "fields": [
        {"name": "Timeout", "type": "int", "mandatory": false}
      ]
    },
    "EPICSCAInput": {
      "fields": []
    },
    "EPICSCAOutput": {
      "fields": []
    },
    "EPICSPVAInput": {
      "fields": []
    },
    "EPICSPVAOutput": {
      "fields": []
    },
    "SDNSubscriber": {
      "fields": [
        {"name": "Address", "type": "string", "mandatory": true},
        {"name": "Port", "type": "int", "mandatory": true},
        {"name": "Interface", "type": "string", "mandatory": false}
      ]
    },
    "SDNPublisher": {
      "fields": [
        {"name": "Address", "type": "string", "mandatory": true},
        {"name": "Port", "type": "int", "mandatory": true},
        {"name": "Interface", "type": "string", "mandatory": false}
      ]
    },
    "UDPReceiver": {
      "fields": [
        {"name": "Port", "type": "int", "mandatory": true},
        {"name": "Address", "type": "string", "mandatory": false}
      ]
    },
    "UDPSender": {
      "fields": [
        {"name": "Destination", "type": "string", "mandatory": true}
      ]
    },
    "FileReader": {
      "fields": [
        {"name": "Filename", "type": "string", "mandatory": true},
        {"name": "Format", "type": "string", "mandatory": false},
        {"name": "Interpolate", "type": "string", "mandatory": false}
      ]
    },
    "FileWriter": {
      "fields": [
        {"name": "Filename", "type": "string", "mandatory": true},
        {"name": "Format", "type": "string", "mandatory": false},
        {"name": "StoreOnTrigger", "type": "int", "mandatory": false}
      ]
    },
    "OrderedClass": {
      "ordered": true,
      "fields": [
        {"name": "First", "type": "int", "mandatory": true},
        {"name": "Second", "type": "string", "mandatory": true}
      ]
    },
    "BaseLib2GAM": { "fields": [] },
    "ConversionGAM": { "fields": [] },
    "DoubleHandshakeGAM": { "fields": [] },
    "FilterGAM": {
      "fields": [
        {"name": "Num", "type": "array", "mandatory": true},
        {"name": "Den", "type": "array", "mandatory": true},
        {"name": "ResetInEachState", "type": "any", "mandatory": false},
        {"name": "InputSignals", "type": "node", "mandatory": false},
        {"name": "OutputSignals", "type": "node", "mandatory": false}
      ]
    },
    "HistogramGAM": {
      "fields": [
        {"name": "BeginCycleNumber", "type": "int", "mandatory": false},
        {"name": "StateChangeResetName", "type": "string", "mandatory": false},
        {"name": "InputSignals", "type": "node", "mandatory": false},
        {"name": "OutputSignals", "type": "node", "mandatory": false}
      ]
    },
    "Interleaved2FlatGAM": { "fields": [] },
    "FlattenedStructIOGAM": { "fields": [] },
    "MathExpressionGAM": {
      "fields": [
        {"name": "Expression", "type": "string", "mandatory": true},
        {"name": "InputSignals", "type": "node", "mandatory": false},
        {"name": "OutputSignals", "type": "node", "mandatory": false}
      ]
    },
    "MessageGAM": { "fields": [] },
    "MuxGAM": { "fields": [] },
    "SimulinkWrapperGAM": { "fields": [] },
    "SSMGAM": { "fields": [] },
    "StatisticsGAM": { "fields": [] },
    "TimeCorrectionGAM": { "fields": [] },
    "TriggeredIOGAM": { "fields": [] },
    "WaveformGAM": { "fields": [] },
    "DAN": { "fields": [] },
    "LinuxTimer": {
      "fields": [
        {"name": "ExecutionMode", "type": "string", "mandatory": false},
        {"name": "SleepNature", "type": "string", "mandatory": false},
        {"name": "SleepPercentage", "type": "any", "mandatory": false},
        {"name": "Phase", "type": "int", "mandatory": false},
        {"name": "CPUMask", "type": "int", "mandatory": false},
        {"name": "TimeProvider", "type": "node", "mandatory": false},
        {"name": "Signals", "type": "node", "mandatory": true}
      ]
    },
    "LinkDataSource": { "fields": [] },
    "MDSReader": {
      "fields": [
        {"name": "TreeName", "type": "string", "mandatory": true},
        {"name": "ShotNumber", "type": "int", "mandatory": true},
        {"name": "Frequency", "type": "float", "mandatory": true},
        {"name": "Signals", "type": "node", "mandatory": true}
      ]
    },
    "MDSWriter": {
      "fields": [
        {"name": "NumberOfBuffers", "type": "int", "mandatory": true},
        {"name": "CPUMask", "type": "int", "mandatory": true},
        {"name": "StackSize", "type": "int", "mandatory": true},
        {"name": "TreeName", "type": "string", "mandatory": true},
        {"name": "PulseNumber", "type": "int", "mandatory": false},
        {"name": "StoreOnTrigger", "type": "int", "mandatory": true},
        {"name": "EventName", "type": "string", "mandatory": true},
        {"name": "TimeRefresh", "type": "float", "mandatory": true},
        {"name": "NumberOfPreTriggers", "type": "int", "mandatory": false},
        {"name": "NumberOfPostTriggers", "type": "int", "mandatory": false},
        {"name": "Signals", "type": "node", "mandatory": true},
        {"name": "Messages", "type": "node", "mandatory": false}
      ]
    },
    "NI1588TimeStamp": { "fields": [] },
    "NI6259ADC": { "fields": [] },
    "NI6259DAC": { "fields": [] },
    "NI6259DIO": { "fields": [] },
    "NI6368ADC": { "fields": [] },
    "NI6368DAC": { "fields": [] },
    "NI6368DIO": { "fields": [] },
    "NI9157CircularFifoReader": { "fields": [] },
    "NI9157MxiDataSource": { "fields": [] },
    "OPCUADSInput": { "fields": [] },
    "OPCUADSOutput": { "fields": [] },
    "RealTimeThreadAsyncBridge": { "fields": [] },
    "RealTimeThreadSynchronisation": { "fields": [] },
    "UARTDataSource": { "fields": [] },
    "BaseLib2Wrapper": { "fields": [] },
    "EPICSCAClient": { "fields": [] },
    "EPICSPVA": { "fields": [] },
    "MemoryGate": { "fields": [] },
    "OPCUA": { "fields": [] },
    "SysLogger": { "fields": [] }
  }
}
@@ -2,133 +2,73 @@ package schema
 
 import (
 	_ "embed"
-	"encoding/json"
 	"fmt"
 	"os"
 	"path/filepath"
 
 	"cuelang.org/go/cue"
 	"cuelang.org/go/cue/cuecontext"
 )
 
-//go:embed marte.json
-var defaultSchemaJSON []byte
+//go:embed marte.cue
+var defaultSchemaCUE []byte
 
 type Schema struct {
-	Classes map[string]ClassDefinition `json:"classes"`
-}
-
-type ClassDefinition struct {
-	Fields  []FieldDefinition `json:"fields"`
-	Ordered bool              `json:"ordered"`
-}
-
-type FieldDefinition struct {
-	Name      string `json:"name"`
-	Type      string `json:"type"` // "int", "float", "string", "bool", "reference", "array", "node", "any"
-	Mandatory bool   `json:"mandatory"`
+	Context *cue.Context
+	Value   cue.Value
 }
 
 func NewSchema() *Schema {
+	ctx := cuecontext.New()
 	return &Schema{
-		Classes: make(map[string]ClassDefinition),
+		Context: ctx,
+		Value:   ctx.CompileBytes(defaultSchemaCUE),
 	}
 }
 
-func LoadSchema(path string) (*Schema, error) {
+// LoadSchema loads a CUE schema from a file and returns the cue.Value
+func LoadSchema(ctx *cue.Context, path string) (cue.Value, error) {
 	content, err := os.ReadFile(path)
 	if err != nil {
-		return nil, err
-	}
-
-	var s Schema
-	if err := json.Unmarshal(content, &s); err != nil {
-		return nil, fmt.Errorf("failed to parse schema: %v", err)
+		return cue.Value{}, err
 	}
-
-	return &s, nil
+	return ctx.CompileBytes(content), nil
 }
 
-// DefaultSchema returns the built-in embedded schema
-func DefaultSchema() *Schema {
-	var s Schema
-	if err := json.Unmarshal(defaultSchemaJSON, &s); err != nil {
-		panic(fmt.Sprintf("failed to parse default embedded schema: %v", err))
-	}
-	if s.Classes == nil {
-		s.Classes = make(map[string]ClassDefinition)
-	}
-	return &s
-}
-
-// Merge adds rules from 'other' to 's'.
-// Rules for the same class are merged (new fields added, existing fields updated).
-func (s *Schema) Merge(other *Schema) {
-	if other == nil {
-		return
-	}
-	for className, classDef := range other.Classes {
-		if existingClass, ok := s.Classes[className]; ok {
-			// Merge fields
-			fieldMap := make(map[string]FieldDefinition)
-			for _, f := range classDef.Fields {
-				fieldMap[f.Name] = f
-			}
-
-			var mergedFields []FieldDefinition
-			seen := make(map[string]bool)
-
-			// Keep existing fields, update if present in other
-			for _, f := range existingClass.Fields {
-				if newF, ok := fieldMap[f.Name]; ok {
-					mergedFields = append(mergedFields, newF)
-				} else {
-					mergedFields = append(mergedFields, f)
-				}
-				seen[f.Name] = true
-			}
-
-			// Append new fields
-			for _, f := range classDef.Fields {
-				if !seen[f.Name] {
-					mergedFields = append(mergedFields, f)
-				}
-			}
-
-			existingClass.Fields = mergedFields
-			if classDef.Ordered {
-				existingClass.Ordered = true
-			}
-			s.Classes[className] = existingClass
-		} else {
-			s.Classes[className] = classDef
-		}
-	}
-}
-
 func LoadFullSchema(projectRoot string) *Schema {
-	s := DefaultSchema()
+	ctx := cuecontext.New()
+	baseVal := ctx.CompileBytes(defaultSchemaCUE)
+	if baseVal.Err() != nil {
+		// Fallback or panic? Panic is appropriate for embedded schema failure
+		panic(fmt.Sprintf("Embedded schema invalid: %v", baseVal.Err()))
+	}
 
 	// 1. System Paths
 	sysPaths := []string{
-		"/usr/share/mdt/marte_schema.json",
+		"/usr/share/mdt/marte_schema.cue",
 	}
 
 	home, err := os.UserHomeDir()
 	if err == nil {
-		sysPaths = append(sysPaths, filepath.Join(home, ".local/share/mdt/marte_schema.json"))
+		sysPaths = append(sysPaths, filepath.Join(home, ".local/share/mdt/marte_schema.cue"))
 	}
 
 	for _, path := range sysPaths {
-		if sysSchema, err := LoadSchema(path); err == nil {
-			s.Merge(sysSchema)
+		if val, err := LoadSchema(ctx, path); err == nil && val.Err() == nil {
+			baseVal = baseVal.Unify(val)
 		}
 	}
 
 	// 2. Project Path
 	if projectRoot != "" {
-		projectSchemaPath := filepath.Join(projectRoot, ".marte_schema.json")
-		if projSchema, err := LoadSchema(projectSchemaPath); err == nil {
-			s.Merge(projSchema)
+		projectSchemaPath := filepath.Join(projectRoot, ".marte_schema.cue")
+		if val, err := LoadSchema(ctx, projectSchemaPath); err == nil && val.Err() == nil {
+			baseVal = baseVal.Unify(val)
 		}
 	}
 
-	return s
+	return &Schema{
+		Context: ctx,
+		Value:   baseVal,
+	}
 }
@@ -2,9 +2,15 @@ package validator
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"github.com/marte-dev/marte-dev-tools/internal/index"
|
||||
"github.com/marte-dev/marte-dev-tools/internal/parser"
|
||||
"github.com/marte-dev/marte-dev-tools/internal/schema"
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
"cuelang.org/go/cue"
|
||||
"cuelang.org/go/cue/errors"
|
||||
|
||||
"github.com/marte-community/marte-dev-tools/internal/index"
|
||||
"github.com/marte-community/marte-dev-tools/internal/parser"
|
||||
"github.com/marte-community/marte-dev-tools/internal/schema"
|
||||
)
|
||||
|
||||
type DiagnosticLevel int
|
||||
@@ -38,6 +44,9 @@ func (v *Validator) ValidateProject() {
|
||||
if v.Tree == nil {
|
||||
return
|
||||
}
|
||||
// Ensure references are resolved (if not already done by builder/lsp)
|
||||
v.Tree.ResolveReferences()
|
||||
|
||||
if v.Tree.Root != nil {
|
||||
v.validateNode(v.Tree.Root)
|
||||
}
|
||||
@@ -47,25 +56,27 @@ func (v *Validator) ValidateProject() {
|
||||
}
|
||||
|
||||
func (v *Validator) validateNode(node *index.ProjectNode) {
|
||||
// Collect fields and their definitions
|
||||
fields := make(map[string][]*parser.Field)
|
||||
fieldOrder := []string{} // Keep track of order of appearance (approximate across fragments)
|
||||
|
||||
for _, frag := range node.Fragments {
|
||||
for _, def := range frag.Definitions {
|
||||
if f, ok := def.(*parser.Field); ok {
|
||||
if _, exists := fields[f.Name]; !exists {
|
||||
fieldOrder = append(fieldOrder, f.Name)
|
||||
// Check for invalid content in Signals container of DataSource
|
||||
if node.RealName == "Signals" && node.Parent != nil && isDataSource(node.Parent) {
|
||||
for _, frag := range node.Fragments {
|
||||
for _, def := range frag.Definitions {
|
||||
if f, ok := def.(*parser.Field); ok {
|
||||
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
||||
Level: LevelError,
|
||||
Message: fmt.Sprintf("Invalid content in Signals container: Field '%s' is not allowed. Only Signal objects are allowed.", f.Name),
|
||||
Position: f.Position,
|
||||
File: frag.File,
|
||||
})
|
||||
}
|
||||
fields[f.Name] = append(fields[f.Name], f)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// 1. Check for duplicate fields
|
||||
fields := v.getFields(node)
|
||||
|
||||
// 1. Check for duplicate fields (Go logic)
|
||||
for name, defs := range fields {
|
||||
if len(defs) > 1 {
|
||||
// Report error on the second definition
|
||||
firstFile := v.getFileForField(defs[0], node)
|
||||
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
||||
Level: LevelError,
|
||||
@@ -80,13 +91,7 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
|
||||
className := ""
|
||||
if node.RealName != "" && (node.RealName[0] == '+' || node.RealName[0] == '$') {
|
||||
if classFields, ok := fields["Class"]; ok && len(classFields) > 0 {
|
||||
// Extract class name from value
|
||||
switch val := classFields[0].Value.(type) {
|
||||
case *parser.StringValue:
|
||||
className = val.Value
|
||||
case *parser.ReferenceValue:
|
||||
className = val.Value
|
||||
}
|
||||
className = v.getFieldValue(classFields[0])
|
||||
}
|
||||
|
||||
hasType := false
|
||||
@@ -104,13 +109,25 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
				File: file,
			})
		}

		if className == "RealTimeThread" {
			v.checkFunctionsArray(node, fields)
		}
	}

	// 3. Schema Validation
	// 3. CUE Validation
	if className != "" && v.Schema != nil {
		if classDef, ok := v.Schema.Classes[className]; ok {
			v.validateClass(node, classDef, fields, fieldOrder)
		}
		v.validateWithCUE(node, className)
	}

	// 4. Signal Validation (for DataSource signals)
	if isSignal(node) {
		v.validateSignal(node, fields)
	}

	// 5. GAM Validation (Signal references)
	if isGAM(node) {
		v.validateGAM(node)
	}

	// Recursively validate children
@@ -119,113 +136,414 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
	}
}

func (v *Validator) validateClass(node *index.ProjectNode, classDef schema.ClassDefinition, fields map[string][]*parser.Field, fieldOrder []string) {
	// Check Mandatory Fields
	for _, fieldDef := range classDef.Fields {
		if fieldDef.Mandatory {
			found := false
			if _, ok := fields[fieldDef.Name]; ok {
				found = true
			} else if fieldDef.Type == "node" {
				// Check children for nodes
				if _, ok := node.Children[fieldDef.Name]; ok {
					found = true
func (v *Validator) validateWithCUE(node *index.ProjectNode, className string) {
	// Check if class exists in schema
	classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s", className))
	if v.Schema.Value.LookupPath(classPath).Err() != nil {
		return // Unknown class, skip validation
	}

	// Convert node to map
	data := v.nodeToMap(node)

	// Encode data to CUE
	dataVal := v.Schema.Context.Encode(data)

	// Unify with #Object
	// #Object requires "Class" field, which is present in data.
	objDef := v.Schema.Value.LookupPath(cue.ParsePath("#Object"))

	// Unify
	res := objDef.Unify(dataVal)

	if err := res.Validate(cue.Concrete(true)); err != nil {
		// Report errors

		// Parse CUE error to diagnostic
		v.reportCUEError(err, node)
	}
}

func (v *Validator) reportCUEError(err error, node *index.ProjectNode) {
	list := errors.Errors(err)
	for _, e := range list {
		msg := e.Error()
		v.Diagnostics = append(v.Diagnostics, Diagnostic{
			Level: LevelError,
			Message: fmt.Sprintf("Schema Validation Error: %v", msg),
			Position: v.getNodePosition(node),
			File: v.getNodeFile(node),
		})
	}
}

func (v *Validator) nodeToMap(node *index.ProjectNode) map[string]interface{} {
	m := make(map[string]interface{})
	fields := v.getFields(node)

	for name, defs := range fields {
		if len(defs) > 0 {
			// Use the last definition (duplicates checked elsewhere)
			m[name] = v.valueToInterface(defs[len(defs)-1].Value)
		}
	}

	// Children as nested maps?
	// CUE schema expects nested structs for "node" type fields.
	// But `node.Children` contains ALL children (even those defined as +Child).
	// If schema expects `States: { ... }`, we map children.

	for name, child := range node.Children {
		// normalize name? CUE keys are strings.
		// If child real name is "+States", key in Children is "States".
		// We use "States" as key in map.
		m[name] = v.nodeToMap(child)
	}

	return m
}

func (v *Validator) valueToInterface(val parser.Value) interface{} {
	switch t := val.(type) {
	case *parser.StringValue:
		return t.Value
	case *parser.IntValue:
		i, _ := strconv.ParseInt(t.Raw, 0, 64)
		return i // CUE handles int64
	case *parser.FloatValue:
		f, _ := strconv.ParseFloat(t.Raw, 64)
		return f
	case *parser.BoolValue:
		return t.Value
	case *parser.ReferenceValue:
		return t.Value
	case *parser.ArrayValue:
		var arr []interface{}
		for _, e := range t.Elements {
			arr = append(arr, v.valueToInterface(e))
		}
		return arr
	}
	return nil
}

func (v *Validator) validateSignal(node *index.ProjectNode, fields map[string][]*parser.Field) {
	// ... (same as before)
	if typeFields, ok := fields["Type"]; !ok || len(typeFields) == 0 {
		v.Diagnostics = append(v.Diagnostics, Diagnostic{
			Level: LevelError,
			Message: fmt.Sprintf("Signal '%s' is missing mandatory field 'Type'", node.RealName),
			Position: v.getNodePosition(node),
			File: v.getNodeFile(node),
		})
	} else {
		typeVal := typeFields[0].Value
		var typeStr string
		switch t := typeVal.(type) {
		case *parser.StringValue:
			typeStr = t.Value
		case *parser.ReferenceValue:
			typeStr = t.Value
		default:
			v.Diagnostics = append(v.Diagnostics, Diagnostic{
				Level: LevelError,
				Message: fmt.Sprintf("Field 'Type' in Signal '%s' must be a type name", node.RealName),
				Position: typeFields[0].Position,
				File: v.getFileForField(typeFields[0], node),
			})
			return
		}

		if !isValidType(typeStr) {
			v.Diagnostics = append(v.Diagnostics, Diagnostic{
				Level: LevelError,
				Message: fmt.Sprintf("Invalid Type '%s' for Signal '%s'", typeStr, node.RealName),
				Position: typeFields[0].Position,
				File: v.getFileForField(typeFields[0], node),
			})
		}
	}
}

func (v *Validator) validateGAM(node *index.ProjectNode) {
	if inputs, ok := node.Children["InputSignals"]; ok {
		v.validateGAMSignals(node, inputs, "Input")
	}
	if outputs, ok := node.Children["OutputSignals"]; ok {
		v.validateGAMSignals(node, outputs, "Output")
	}
}

func (v *Validator) validateGAMSignals(gamNode, signalsContainer *index.ProjectNode, direction string) {
	for _, signal := range signalsContainer.Children {
		v.validateGAMSignal(gamNode, signal, direction)
	}
}

func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, direction string) {
	fields := v.getFields(signalNode)
	var dsName string
	if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
		dsName = v.getFieldValue(dsFields[0])
	}

	if dsName == "" {
		return // Ignore implicit signals or missing datasource (handled elsewhere if mandatory)
	}

	dsNode := v.resolveReference(dsName, v.getNodeFile(signalNode), isDataSource)
	if dsNode == nil {
		v.Diagnostics = append(v.Diagnostics, Diagnostic{
			Level: LevelError,
			Message: fmt.Sprintf("Unknown DataSource '%s' referenced in signal '%s'", dsName, signalNode.RealName),
			Position: v.getNodePosition(signalNode),
			File: v.getNodeFile(signalNode),
		})
		return
	}

	// Link DataSource reference
	if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
		if val, ok := dsFields[0].Value.(*parser.ReferenceValue); ok {
			v.updateReferenceTarget(v.getNodeFile(signalNode), val.Position, dsNode)
		}
	}

	// Check Direction using CUE Schema
	dsClass := v.getNodeClass(dsNode)
	if dsClass != "" {
		// Lookup class definition in Schema
		// path: #Classes.ClassName.direction
		path := cue.ParsePath(fmt.Sprintf("#Classes.%s.direction", dsClass))
		val := v.Schema.Value.LookupPath(path)

		if val.Err() == nil {
			dsDir, err := val.String()
			if err == nil && dsDir != "" {
				if direction == "Input" && dsDir == "OUT" {
					v.Diagnostics = append(v.Diagnostics, Diagnostic{
						Level: LevelError,
						Message: fmt.Sprintf("DataSource '%s' (Class %s) is Output-only but referenced in InputSignals of GAM '%s'", dsName, dsClass, gamNode.RealName),
						Position: v.getNodePosition(signalNode),
						File: v.getNodeFile(signalNode),
					})
				}
				if direction == "Output" && dsDir == "IN" {
					v.Diagnostics = append(v.Diagnostics, Diagnostic{
						Level: LevelError,
						Message: fmt.Sprintf("DataSource '%s' (Class %s) is Input-only but referenced in OutputSignals of GAM '%s'", dsName, dsClass, gamNode.RealName),
						Position: v.getNodePosition(signalNode),
						File: v.getNodeFile(signalNode),
					})
				}
			}

			if !found {
				v.Diagnostics = append(v.Diagnostics, Diagnostic{
					Level: LevelError,
					Message: fmt.Sprintf("Missing mandatory field '%s' for class '%s'", fieldDef.Name, node.Metadata["Class"]),
					Position: v.getNodePosition(node),
					File: v.getNodeFile(node),
				})
			}
		}
	}

	// Check Field Types
	for _, fieldDef := range classDef.Fields {
		if fList, ok := fields[fieldDef.Name]; ok {
			f := fList[0] // Check the first definition (duplicates handled elsewhere)
			if !v.checkType(f.Value, fieldDef.Type) {
				v.Diagnostics = append(v.Diagnostics, Diagnostic{
					Level: LevelError,
					Message: fmt.Sprintf("Field '%s' expects type '%s'", fieldDef.Name, fieldDef.Type),
					Position: f.Position,
					File: v.getFileForField(f, node),
				})
			}
		}
	// Check Signal Existence
	targetSignalName := index.NormalizeName(signalNode.RealName)
	if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
		targetSignalName = v.getFieldValue(aliasFields[0]) // Alias is usually the name in DataSource
	}

	// Check Field Order
	if classDef.Ordered {
		// Verify that fields present in the node appear in the order defined in the schema
		// Only consider fields that are actually in the schema's field list
		schemaIdx := 0
		for _, nodeFieldName := range fieldOrder {
			// Find this field in schema
			foundInSchema := false
			for i, fd := range classDef.Fields {
				if fd.Name == nodeFieldName {
					foundInSchema = true
					// Check if this field appears AFTER the current expected position
					if i < schemaIdx {
						// This field appears out of order (it should have appeared earlier, or previous fields were missing but this one came too late? No, simple relative order)
						// Actually, simple check: `i` must be >= `lastSeenSchemaIdx`.
						v.Diagnostics = append(v.Diagnostics, Diagnostic{
							Level: LevelError,
							Message: fmt.Sprintf("Field '%s' is out of order", nodeFieldName),
							Position: fields[nodeFieldName][0].Position,
							File: v.getFileForField(fields[nodeFieldName][0], node),
						})
					} else {
						schemaIdx = i
					}
	var targetNode *index.ProjectNode
	if signalsContainer, ok := dsNode.Children["Signals"]; ok {
		targetNorm := index.NormalizeName(targetSignalName)

		if child, ok := signalsContainer.Children[targetNorm]; ok {
			targetNode = child
		} else {
			// Fallback check
			for _, child := range signalsContainer.Children {
				if index.NormalizeName(child.RealName) == targetNorm {
					targetNode = child
					break
				}
			}
			if !foundInSchema {
				// Ignore extra fields for order check? Spec doesn't say strict closed schema.
			}
		}
	}

	if targetNode == nil {
		suppressed := v.isGloballyAllowed("implicit", v.getNodeFile(signalNode))
		if !suppressed {
			for _, p := range signalNode.Pragmas {
				if strings.HasPrefix(p, "implicit:") || strings.HasPrefix(p, "ignore(implicit)") {
					suppressed = true
					break
				}
			}
		}

		if !suppressed {
			v.Diagnostics = append(v.Diagnostics, Diagnostic{
				Level: LevelWarning,
				Message: fmt.Sprintf("Implicitly Defined Signal: '%s' is defined in GAM '%s' but not in DataSource '%s'", targetSignalName, gamNode.RealName, dsName),
				Position: v.getNodePosition(signalNode),
				File: v.getNodeFile(signalNode),
			})
		}

		if typeFields, ok := fields["Type"]; !ok || len(typeFields) == 0 {
			v.Diagnostics = append(v.Diagnostics, Diagnostic{
				Level: LevelError,
				Message: fmt.Sprintf("Implicit signal '%s' must define Type", targetSignalName),
				Position: v.getNodePosition(signalNode),
				File: v.getNodeFile(signalNode),
			})
		} else {
			// Check Type validity even for implicit
			typeVal := v.getFieldValue(typeFields[0])
			if !isValidType(typeVal) {
				v.Diagnostics = append(v.Diagnostics, Diagnostic{
					Level: LevelError,
					Message: fmt.Sprintf("Invalid Type '%s' for Signal '%s'", typeVal, signalNode.RealName),
					Position: typeFields[0].Position,
					File: v.getNodeFile(signalNode),
				})
			}
		}
	} else {
		signalNode.Target = targetNode
		// Link Alias reference
		if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
			if val, ok := aliasFields[0].Value.(*parser.ReferenceValue); ok {
				v.updateReferenceTarget(v.getNodeFile(signalNode), val.Position, targetNode)
			}
		}

		// Property checks
		v.checkSignalProperty(signalNode, targetNode, "Type")
		v.checkSignalProperty(signalNode, targetNode, "NumberOfElements")
		v.checkSignalProperty(signalNode, targetNode, "NumberOfDimensions")

		// Check Type validity if present
		if typeFields, ok := fields["Type"]; ok && len(typeFields) > 0 {
			typeVal := v.getFieldValue(typeFields[0])
			if !isValidType(typeVal) {
				v.Diagnostics = append(v.Diagnostics, Diagnostic{
					Level: LevelError,
					Message: fmt.Sprintf("Invalid Type '%s' for Signal '%s'", typeVal, signalNode.RealName),
					Position: typeFields[0].Position,
					File: v.getNodeFile(signalNode),
				})
			}
		}
	}
}

func (v *Validator) checkType(val parser.Value, expectedType string) bool {
	switch expectedType {
	case "int":
		_, ok := val.(*parser.IntValue)
		return ok
	case "float":
		_, ok := val.(*parser.FloatValue)
		return ok
	case "string":
		_, ok := val.(*parser.StringValue)
		return ok
	case "bool":
		_, ok := val.(*parser.BoolValue)
		return ok
	case "array":
		_, ok := val.(*parser.ArrayValue)
		return ok
	case "reference":
		_, ok := val.(*parser.ReferenceValue)
		return ok
	case "node":
		// This is tricky. A field cannot really be a "node" type in the parser sense (Node = { ... } is an ObjectNode, not a Field).
		// But if the schema says "FieldX" is type "node", maybe it means it expects a reference to a node?
		// Or maybe it means it expects a Subnode?
		// In MARTe, `Field = { ... }` is parsed as ArrayValue usually.
		// If `Field = SubNode`, it's `ObjectNode`.
		// Schema likely refers to `+SubNode = { ... }`.
		// But `validateClass` iterates `fields`.
		// If schema defines a "field" of type "node", it might mean it expects a child node with that name.
		return true // skip for now
	case "any":
func (v *Validator) checkSignalProperty(gamSig, dsSig *index.ProjectNode, prop string) {
	gamVal := gamSig.Metadata[prop]
	dsVal := dsSig.Metadata[prop]

	if gamVal == "" {
		return
	}

	if dsVal != "" && gamVal != dsVal {
		if prop == "Type" {
			if v.checkCastPragma(gamSig, dsVal, gamVal) {
				return
			}
		}

		v.Diagnostics = append(v.Diagnostics, Diagnostic{
			Level: LevelError,
			Message: fmt.Sprintf("Signal '%s' property '%s' mismatch: defined '%s', referenced '%s'", gamSig.RealName, prop, dsVal, gamVal),
			Position: v.getNodePosition(gamSig),
			File: v.getNodeFile(gamSig),
		})
	}
}

func (v *Validator) checkCastPragma(node *index.ProjectNode, defType, curType string) bool {
	for _, p := range node.Pragmas {
		if strings.HasPrefix(p, "cast(") {
			content := strings.TrimPrefix(p, "cast(")
			if idx := strings.Index(content, ")"); idx != -1 {
				content = content[:idx]
				parts := strings.Split(content, ",")
				if len(parts) == 2 {
					d := strings.TrimSpace(parts[0])
					c := strings.TrimSpace(parts[1])
					if d == defType && c == curType {
						return true
					}
				}
			}
		}
	}
	return false
}

func (v *Validator) updateReferenceTarget(file string, pos parser.Position, target *index.ProjectNode) {
	for i := range v.Tree.References {
		ref := &v.Tree.References[i]
		if ref.File == file && ref.Position == pos {
			ref.Target = target
			return
		}
	}
}

// Helpers

func (v *Validator) getFields(node *index.ProjectNode) map[string][]*parser.Field {
	fields := make(map[string][]*parser.Field)
	for _, frag := range node.Fragments {
		for _, def := range frag.Definitions {
			if f, ok := def.(*parser.Field); ok {
				fields[f.Name] = append(fields[f.Name], f)
			}
		}
	}
	return fields
}

func (v *Validator) getFieldValue(f *parser.Field) string {
	switch val := f.Value.(type) {
	case *parser.StringValue:
		return val.Value
	case *parser.ReferenceValue:
		return val.Value
	case *parser.IntValue:
		return val.Raw
	case *parser.FloatValue:
		return val.Raw
	}
	return ""
}

func (v *Validator) resolveReference(name string, file string, predicate func(*index.ProjectNode) bool) *index.ProjectNode {
	if isoNode, ok := v.Tree.IsolatedFiles[file]; ok {
		if found := v.Tree.FindNode(isoNode, name, predicate); found != nil {
			return found
		}
		return nil
	}
	if v.Tree.Root == nil {
		return nil
	}
	return v.Tree.FindNode(v.Tree.Root, name, predicate)
}

func (v *Validator) getNodeClass(node *index.ProjectNode) string {
	if cls, ok := node.Metadata["Class"]; ok {
		return cls
	}
	return ""
}

func isValidType(t string) bool {
	switch t {
	case "uint8", "int8", "uint16", "int16", "uint32", "int32", "uint64", "int64",
		"float32", "float64", "string", "bool", "char8":
		return true
	}
	return false
}

func (v *Validator) checkType(val parser.Value, expectedType string) bool {
	// Legacy function, replaced by CUE.
	return true
}

@@ -248,6 +566,13 @@ func (v *Validator) CheckUnused() {
		}
	}

	if v.Tree.Root != nil {
		v.collectTargetUsage(v.Tree.Root, referencedNodes)
	}
	for _, node := range v.Tree.IsolatedFiles {
		v.collectTargetUsage(node, referencedNodes)
	}

	if v.Tree.Root != nil {
		v.checkUnusedRecursive(v.Tree.Root, referencedNodes)
	}
@@ -256,29 +581,63 @@ func (v *Validator) CheckUnused() {
	}
}

func (v *Validator) collectTargetUsage(node *index.ProjectNode, referenced map[*index.ProjectNode]bool) {
	if node.Target != nil {
		referenced[node.Target] = true
	}
	for _, child := range node.Children {
		v.collectTargetUsage(child, referenced)
	}
}

func (v *Validator) checkUnusedRecursive(node *index.ProjectNode, referenced map[*index.ProjectNode]bool) {
	// Heuristic for GAM
	if isGAM(node) {
		if !referenced[node] {
			v.Diagnostics = append(v.Diagnostics, Diagnostic{
				Level: LevelWarning,
				Message: fmt.Sprintf("Unused GAM: %s is defined but not referenced in any thread or scheduler", node.RealName),
				Position: v.getNodePosition(node),
				File: v.getNodeFile(node),
			})
			suppress := v.isGloballyAllowed("unused", v.getNodeFile(node))
			if !suppress {
				for _, p := range node.Pragmas {
					if strings.HasPrefix(p, "unused:") || strings.HasPrefix(p, "ignore(unused)") {
						suppress = true
						break
					}
				}
			}
			if !suppress {
				v.Diagnostics = append(v.Diagnostics, Diagnostic{
					Level: LevelWarning,
					Message: fmt.Sprintf("Unused GAM: %s is defined but not referenced in any thread or scheduler", node.RealName),
					Position: v.getNodePosition(node),
					File: v.getNodeFile(node),
				})
			}
		}
	}

	// Heuristic for DataSource and its signals
	if isDataSource(node) {
		for _, signal := range node.Children {
			if !referenced[signal] {
				v.Diagnostics = append(v.Diagnostics, Diagnostic{
					Level: LevelWarning,
					Message: fmt.Sprintf("Unused Signal: %s is defined in DataSource %s but never referenced", signal.RealName, node.RealName),
					Position: v.getNodePosition(signal),
					File: v.getNodeFile(signal),
				})
		if signalsNode, ok := node.Children["Signals"]; ok {
			for _, signal := range signalsNode.Children {
				if !referenced[signal] {
					if v.isGloballyAllowed("unused", v.getNodeFile(signal)) {
						continue
					}
					suppress := false
					for _, p := range signal.Pragmas {
						if strings.HasPrefix(p, "unused:") || strings.HasPrefix(p, "ignore(unused)") {
							suppress = true
							break
						}
					}
					if !suppress {
						v.Diagnostics = append(v.Diagnostics, Diagnostic{
							Level: LevelWarning,
							Message: fmt.Sprintf("Unused Signal: %s is defined in DataSource %s but never referenced", signal.RealName, node.RealName),
							Position: v.getNodePosition(signal),
							File: v.getNodeFile(signal),
						})
					}
				}
			}
		}
	}
@@ -301,6 +660,16 @@ func isDataSource(node *index.ProjectNode) bool {
	if node.Parent != nil && node.Parent.Name == "Data" {
		return true
	}
	_, hasSignals := node.Children["Signals"]
	return hasSignals
}

func isSignal(node *index.ProjectNode) bool {
	if node.Parent != nil && node.Parent.Name == "Signals" {
		if isDataSource(node.Parent.Parent) {
			return true
		}
	}
	return false
}

@@ -317,3 +686,63 @@ func (v *Validator) getNodeFile(node *index.ProjectNode) string {
	}
	return ""
}

func (v *Validator) checkFunctionsArray(node *index.ProjectNode, fields map[string][]*parser.Field) {
	if funcs, ok := fields["Functions"]; ok && len(funcs) > 0 {
		f := funcs[0]
		if arr, ok := f.Value.(*parser.ArrayValue); ok {
			for _, elem := range arr.Elements {
				if ref, ok := elem.(*parser.ReferenceValue); ok {
					target := v.resolveReference(ref.Value, v.getNodeFile(node), isGAM)
					if target == nil {
						v.Diagnostics = append(v.Diagnostics, Diagnostic{
							Level: LevelError,
							Message: fmt.Sprintf("Function '%s' not found or is not a valid GAM", ref.Value),
							Position: ref.Position,
							File: v.getNodeFile(node),
						})
					}
				} else {
					v.Diagnostics = append(v.Diagnostics, Diagnostic{
						Level: LevelError,
						Message: "Functions array must contain references",
						Position: f.Position,
						File: v.getNodeFile(node),
					})
				}
			}
		}
	}
}

func (v *Validator) isGloballyAllowed(warningType string, contextFile string) bool {
	prefix1 := fmt.Sprintf("allow(%s)", warningType)
	prefix2 := fmt.Sprintf("ignore(%s)", warningType)

	// If context file is isolated, only check its own pragmas
	if _, isIsolated := v.Tree.IsolatedFiles[contextFile]; isIsolated {
		if pragmas, ok := v.Tree.GlobalPragmas[contextFile]; ok {
			for _, p := range pragmas {
				normalized := strings.ReplaceAll(p, " ", "")
				if strings.HasPrefix(normalized, prefix1) || strings.HasPrefix(normalized, prefix2) {
					return true
				}
			}
		}
		return false
	}

	// If project file, check all non-isolated files
	for file, pragmas := range v.Tree.GlobalPragmas {
		if _, isIsolated := v.Tree.IsolatedFiles[file]; isIsolated {
			continue
		}
		for _, p := range pragmas {
			normalized := strings.ReplaceAll(p, " ", "")
			if strings.HasPrefix(normalized, prefix1) || strings.HasPrefix(normalized, prefix2) {
				return true
			}
		}
	}
	return false
}

@@ -29,7 +29,12 @@ The LSP server should provide the following capabilities:
- **Go to Definition**: Jump to the definition of a reference, supporting navigation across any file in the current project.
- **Go to References**: Find usages of a node or field, supporting navigation across any file in the current project.
- **Code Completion**: Autocomplete fields, values, and references.
- **Code Snippets**: Provide snippets for common patterns.
  - **Context-Aware**: Suggestions depend on the cursor position (e.g., inside an object, assigning a value).
  - **Schema-Driven**: Field suggestions are derived from the CUE schema for the current object's Class, indicating mandatory vs. optional fields.
  - **Reference Suggestions**:
    - `DataSource` fields suggest available DataSource objects.
    - `Functions` (in Threads) suggest available GAM objects.
- **Code Snippets**: Provide snippets for common patterns (e.g., `+Object = { ... }`).
- **Formatting**: Format the document using the same rules and engine as the `fmt` command.
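
As a sketch of the snippet capability (the object name and class below are hypothetical placeholders, not part of the specification), expanding a `+Object` snippet could produce:

```
+MyObject = {
    Class = ReferenceContainer
    // completion here would suggest fields from the CUE schema for the Class
}
```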

## Build System & File Structure
@@ -47,9 +52,9 @@ The LSP server should provide the following capabilities:
- **Namespace Consistency**: The build tool must verify that all input files belong to the same project namespace (the first segment of the `#package` URI). If multiple project namespaces are detected, the build must fail with an error.
- **Target**: The build output is written to a single target file (e.g., provided via CLI or API).
- **Multi-File Definitions**: Nodes and objects can be defined across multiple files. The build tool, validator, and LSP must merge these definitions (including all fields and sub-nodes) from the entire project to create a unified view before processing or validating.
- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project.
- **Merging Order**: For objects defined across multiple files, the **first file** to be considered is the one containing the `Class` field definition.
- **Field Order**: Within a single file, the relative order of defined fields must be maintained.
- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project. Support for dot-separated paths (e.g., `Node.SubNode`) is required.
- **Merging Order**: For objects defined across multiple files, definitions are merged. The build tool must preserve the relative order of fields and sub-nodes as they appear in the source files, interleaving them correctly in the final output.
- **Field Order**: Within a single file (and across merged files), the relative order of defined fields must be maintained in the output.
- The LSP indexes only files belonging to the same project/namespace scope.
- **Output**: The output format is the same as the input configuration but without the `#package` macro.
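
As an illustrative sketch of multi-file merging (file, object, and field names are hypothetical), two files contributing to the same object:

```
// controller_a.cfg
+Controller = {
    Class = PIDGAM
    Kp = 1.0
}

// controller_b.cfg
+Controller = {
    Ki = 0.5
}
```

The build would emit a single `+Controller` object containing `Class`, `Kp`, and `Ki`, with the relative order of fields from each source file preserved.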

@@ -84,8 +89,13 @@ The LSP server should provide the following capabilities:
- **Nodes (`+` / `$`)**: The prefixes `+` and `$` indicate that the node represents an object.
  - **Constraint**: These nodes _must_ contain a field named `Class` within their subnode definition (across all files where the node is defined).
- **Signals**: Signals are considered nodes but **not** objects. They do not require a `Class` field.
- **Pragmas (`//!`)**: Used to suppress specific diagnostics. The developer can use these to explain why a rule is being ignored.
- **Pragmas (`//!`)**: Used to suppress specific diagnostics. The developer can use these to explain why a rule is being ignored. Supported pragmas:
  - `//!unused: REASON` or `//!ignore(unused): REASON` - Suppress "Unused GAM" or "Unused Signal" warnings.
  - `//!implicit: REASON` or `//!ignore(implicit): REASON` - Suppress "Implicitly Defined Signal" warnings.
  - `//!allow(WARNING_TYPE): REASON` or `//!ignore(WARNING_TYPE): REASON` - Global suppression for a specific warning type across the whole project (supported: `unused`, `implicit`).
  - `//!cast(DEF_TYPE, CUR_TYPE): REASON` - Suppress "Type Inconsistency" errors if types match.
- **Structure**: A configuration is composed of one or more definitions.
- **Strictness**: Any content that is not a valid comment (or pragma/docstring) or a valid definition (Field, Node, or Object) is **not allowed** and must generate a parsing error.
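
A hypothetical fragment showing how such pragmas might be attached to definitions (object, class, and signal names are placeholders):

```
//!ignore(unused): kept for the commissioning campaign
+DebugGAM = {
    Class = IOGAM
}

//!cast(float64, float32): precision loss is acceptable here
Measured = {
    DataSource = DDB1
    Type = float32
}
```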

### Core MARTe Classes

@@ -105,29 +115,33 @@ MARTe configurations typically involve several main categories of objects:
- **Requirements**:
  - All signal definitions **must** include a `Type` field with a valid value.
  - **Size Information**: Signals can optionally include `NumberOfDimensions` and `NumberOfElements` fields. If not explicitly defined, these default to `1`.
  - **Property Matching**: Signal references in GAMs must match the properties (`Type`, `NumberOfElements`, `NumberOfDimensions`) of the defined signal in the `DataSource`.
  - **Extensibility**: Signal definitions can include additional fields as required by the specific application context.
- **Signal Reference Syntax**:
  - Signals are referenced or defined in `InputSignals` or `OutputSignals` sub-nodes using one of the following formats:
    1. **Direct Reference**:
    1. **Direct Reference (Option 1)**:
       ```
       SIGNAL_NAME = {
         DataSource = SIGNAL_DATASOURCE
         DataSource = DATASOURCE_NAME
         // Other fields if necessary
       }
       ```
    2. **Aliased Reference**:
       In this case, the GAM signal name is the same as the DataSource signal name.
    2. **Aliased Reference (Option 2)**:
       ```
       NAME = {
       GAM_SIGNAL_NAME = {
         Alias = SIGNAL_NAME
         DataSource = SIGNAL_DATASOURCE
         DataSource = DATASOURCE_NAME
         // ...
       }
       ```
       In this case, `Alias` points to the DataSource signal name.
  - **Implicit Definition Constraint**: If a signal is implicitly defined within a GAM, the `Type` field **must** be present in the reference block to define the signal's properties.
- **Directionality**: DataSources and their signals are directional:
  - `Input`: Only providing data.
  - `Output`: Only receiving data.
  - `Inout`: Bidirectional data flow.
  - `Input` (IN): Only providing data. Signals can only be used in `InputSignals`.
  - `Output` (OUT): Only receiving data. Signals can only be used in `OutputSignals`.
  - `Inout` (INOUT): Bidirectional data flow. Signals can be used in both `InputSignals` and `OutputSignals`.
- **Validation**: The tool must validate that signal usage in GAMs respects the direction of the referenced DataSource.
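
For illustration (class and signal names below are hypothetical), a signal from an input-only DataSource may be consumed in `InputSignals` but not produced in `OutputSignals`:

```
+ADC = {
    Class = SomeADCSource    // assume this class has direction IN
    Signals = {
        Voltage = {
            Type = float32
        }
    }
}

+Monitor = {
    Class = IOGAM
    InputSignals = {
        Voltage = {
            DataSource = ADC    // valid: IN DataSource used as input
        }
    }
    // Referencing Voltage in OutputSignals would be a direction violation.
}
```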

### Object Indexing & References

@@ -151,13 +165,13 @@ The tool must build an index of the configuration to support LSP features and va
- **Field Order**: Verification that specific fields appear in a prescribed order when required by the class definition.
- **Conditional Fields**: Validation of fields whose presence or value depends on the values of other fields within the same node or context.
- **Schema Definition**:
  - Class validation rules must be defined in a separate schema file.
  - Class validation rules must be defined in a separate schema file using the **CUE** language.
  - **Project-Specific Classes**: Developers can define their own project-specific classes and corresponding validation rules, expanding the validation capabilities for their specific needs.
- **Schema Loading**:
  - **Default Schema**: The tool should look for a default schema file `marte_schema.json` in standard system locations:
    - `/usr/share/mdt/marte_schema.json`
    - `$HOME/.local/share/mdt/marte_schema.json`
  - **Project Schema**: If a file named `.marte_schema.json` exists in the project root, it must be loaded.
  - **Default Schema**: The tool should look for a default schema file `marte_schema.cue` in standard system locations:
    - `/usr/share/mdt/marte_schema.cue`
    - `$HOME/.local/share/mdt/marte_schema.cue`
  - **Project Schema**: If a file named `.marte_schema.cue` exists in the project root, it must be loaded.
  - **Merging**: The final schema is a merge of the built-in schema, the system default schema (if found), and the project-specific schema. Rules in later sources (Project > System > Built-in) append to or override earlier ones.
- **Duplicate Fields**:
  - **Constraint**: A field must not be defined more than once within the same object/node scope, even if those definitions are spread across different files.
|
||||
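As a sketch of what a project-specific CUE rule might look like (the `#Classes` shape follows the test suite's schema injection; the class and field names here are illustrative, not part of the built-in schema):

```cue
package schema

#Classes: {
	// Hypothetical project-specific class with an enum-constrained field
	MyProjectGAM: {
		Gain: number
		Mode: "Auto" | "Manual"
	}
}
```

A project would place such rules in `.marte_schema.cue` in its root, and they would be merged on top of the system and built-in schemas.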
The LSP and `check` command should report the following:

- **Warnings**:
  - **Unused GAM**: A GAM is defined but not referenced in any thread or scheduler. (Suppress with `//!unused`)
  - **Unused Signal**: A signal is explicitly defined in a `DataSource` but never referenced in any `GAM`. (Suppress with `//!unused`)
  - **Implicitly Defined Signal**: A signal is defined only within a `GAM` and not in its parent `DataSource`. (Suppress with `//!implicit`)

- **Errors**:
  - **Type Inconsistency**: A signal is referenced with a type different from its definition. (Suppress with `//!cast`)
  - **Size Inconsistency**: A signal is referenced with a size (dimensions/elements) different from its definition.
  - **Invalid Signal Content**: The `Signals` container of a `DataSource` contains invalid elements (e.g., fields instead of nodes).
  - **Duplicate Field Definition**: A field is defined multiple times within the same node scope (including across multiple files).
  - **Validation Errors**:
    - Missing mandatory fields.
    - Field type mismatches.
    - Grammar errors (e.g., missing closing brackets).
  - **Invalid Function Reference**: Elements in the `Functions` array of a `State.Thread` must be valid references to defined GAM nodes.
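Assuming the suppression comment is attached to the offending definition (the exact placement rule is not restated here), usage might look like:

```
// This GAM is kept for debugging and is intentionally not scheduled.
//!unused
+DebugGAM = {
    Class = IOGAM
}
```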
## Logging
@@ -5,7 +5,7 @@ import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/builder"
)

func TestMultiFileBuildMergeAndOrder(t *testing.T) {
@@ -18,7 +18,7 @@ func TestMultiFileBuildMergeAndOrder(t *testing.T) {
	// File 1: Has FieldA, no Class.
	// File 2: Has Class, FieldB.
	// Both in package +MyObj

	f1Content := `
#package Proj.+MyObj
FieldA = 10
@@ -30,10 +30,10 @@ FieldB = 20
`
	os.WriteFile("build_multi_test/f1.marte", []byte(f1Content), 0644)
	os.WriteFile("build_multi_test/f2.marte", []byte(f2Content), 0644)

	// Execute Build
	b := builder.NewBuilder([]string{"build_multi_test/f1.marte", "build_multi_test/f2.marte"})

	// Prepare output file
	// Should be +MyObj.marte (normalized MyObj.marte) - Actually checking content
	outputFile := "build_multi_test/MyObj.marte"
@@ -48,19 +48,19 @@
		t.Fatalf("Build failed: %v", err)
	}
	f.Close() // Close to flush

	// Check Output
	if _, err := os.Stat(outputFile); os.IsNotExist(err) {
		t.Fatalf("Expected output file not found")
	}

	content, err := os.ReadFile(outputFile)
	if err != nil {
		t.Fatalf("Failed to read output: %v", err)
	}

	output := string(content)

	// Check presence
	if !strings.Contains(output, "Class = \"MyClass\"") {
		t.Error("Output missing Class")
@@ -71,23 +71,23 @@ FieldB = 20
	if !strings.Contains(output, "FieldB = 20") {
		t.Error("Output missing FieldB")
	}

	// Check Order: Class/FieldB (from f2) should come BEFORE FieldA (from f1)
	// because f2 has the Class definition.

	idxClass := strings.Index(output, "Class")
	idxFieldB := strings.Index(output, "FieldB")
	idxFieldA := strings.Index(output, "FieldA")

	if idxClass == -1 || idxFieldB == -1 || idxFieldA == -1 {
		t.Fatal("Missing fields in output")
	}

	// Class should be first
	if idxClass > idxFieldA {
		t.Errorf("Expected Class (from f2) to be before FieldA (from f1). Output:\n%s", output)
	}

	// FieldB should be near Class (same fragment)
	// FieldA should be after
	if idxFieldB > idxFieldA {
@@ -7,11 +7,11 @@ import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/builder"
	"github.com/marte-community/marte-dev-tools/internal/formatter"
	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestCheckCommand(t *testing.T) {
@@ -120,7 +120,7 @@ func TestFmtCommand(t *testing.T) {
	formatter.Format(config, &buf)

	output := buf.String()

	// Check for indentation
	if !strings.Contains(output, "  Class = \"MyClass\"") {
		t.Error("Expected 2-space indentation for Class field")
@@ -169,7 +169,7 @@ func TestBuildCommand(t *testing.T) {
	// Test Merge
	files := []string{"integration/build_merge_1.marte", "integration/build_merge_2.marte"}
	b := builder.NewBuilder(files)

	outputFile, err := os.Create("build_test/TEST.marte")
	if err != nil {
		t.Fatalf("Failed to create output file: %v", err)
@@ -180,23 +180,23 @@ func TestBuildCommand(t *testing.T) {
	if err != nil {
		t.Fatalf("Build failed: %v", err)
	}

	// Check output existence
	if _, err := os.Stat("build_test/TEST.marte"); os.IsNotExist(err) {
		t.Fatalf("Expected output file build_test/TEST.marte not found")
	}

	content, _ := ioutil.ReadFile("build_test/TEST.marte")
	output := string(content)

	if !strings.Contains(output, "FieldA = 1") || !strings.Contains(output, "FieldB = 2") {
		t.Error("Merged output missing fields")
	}

	// Test Order (Class First)
	filesOrder := []string{"integration/build_order_1.marte", "integration/build_order_2.marte"}
	bOrder := builder.NewBuilder(filesOrder)

	outputFileOrder, err := os.Create("build_test/ORDER.marte")
	if err != nil {
		t.Fatalf("Failed to create output file: %v", err)
@@ -207,18 +207,18 @@ func TestBuildCommand(t *testing.T) {
	if err != nil {
		t.Fatalf("Build order test failed: %v", err)
	}

	contentOrder, _ := ioutil.ReadFile("build_test/ORDER.marte")
	outputOrder := string(contentOrder)

	// Check for Class before Field
	classIdx := strings.Index(outputOrder, "Class = \"Ordered\"")
	fieldIdx := strings.Index(outputOrder, "Field = 1")

	if classIdx == -1 || fieldIdx == -1 {
		t.Fatal("Missing Class or Field in ordered output")
	}
	if classIdx > fieldIdx {
		t.Error("Expected Class to appear before Field in merged output")
	}
}
}
test/lsp_completion_test.go (new file, 320 lines):
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/lsp"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/schema"
)

func TestHandleCompletion(t *testing.T) {
	setup := func() {
		lsp.Tree = index.NewProjectTree()
		lsp.Documents = make(map[string]string)
		lsp.ProjectRoot = "."
		lsp.GlobalSchema = schema.NewSchema()
	}

	uri := "file://test.marte"
	path := "test.marte"

	t.Run("Suggest Classes", func(t *testing.T) {
		setup()
		content := "+Obj = { Class = "
		lsp.Documents[uri] = content

		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 0, Character: len(content)},
		}

		list := lsp.HandleCompletion(params)
		if list == nil || len(list.Items) == 0 {
			t.Fatal("Expected class suggestions, got none")
		}

		found := false
		for _, item := range list.Items {
			if item.Label == "RealTimeApplication" {
				found = true
				break
			}
		}
		if !found {
			t.Error("Expected RealTimeApplication in class suggestions")
		}
	})

	t.Run("Suggest Fields", func(t *testing.T) {
		setup()
		content := `
+MyApp = {
    Class = RealTimeApplication

}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		// Position at line 3 (empty line inside MyApp)
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 3, Character: 4},
		}

		list := lsp.HandleCompletion(params)
		if list == nil || len(list.Items) == 0 {
			t.Fatal("Expected field suggestions, got none")
		}

		foundData := false
		for _, item := range list.Items {
			if item.Label == "Data" {
				foundData = true
				if item.Detail != "Mandatory" {
					t.Errorf("Expected Data to be Mandatory, got %s", item.Detail)
				}
			}
		}
		if !foundData {
			t.Error("Expected 'Data' in field suggestions for RealTimeApplication")
		}
	})

	t.Run("Suggest References (DataSource)", func(t *testing.T) {
		setup()
		content := `
$App = {
    $Data = {
        +InDS = {
            Class = FileReader
            +Signals = {
                Sig1 = { Type = uint32 }
            }
        }
    }
}
+MyGAM = {
    Class = IOGAM
    +InputSignals = {
        S1 = { DataSource = }
    }
}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)
		lsp.Tree.ResolveReferences()

		// Position at end of "DataSource = "
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 14, Character: 28},
		}

		list := lsp.HandleCompletion(params)
		if list == nil || len(list.Items) == 0 {
			t.Fatal("Expected DataSource suggestions, got none")
		}

		foundDS := false
		for _, item := range list.Items {
			if item.Label == "InDS" {
				foundDS = true
				break
			}
		}
		if !foundDS {
			t.Error("Expected 'InDS' in suggestions for DataSource field")
		}
	})

	t.Run("Filter Existing Fields", func(t *testing.T) {
		setup()
		content := `
+MyThread = {
    Class = RealTimeThread
    Functions = { }

}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		// Position at line 4
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 4, Character: 4},
		}

		list := lsp.HandleCompletion(params)
		for _, item := range list.Items {
			if item.Label == "Functions" || item.Label == "Class" {
				t.Errorf("Did not expect already defined field %s in suggestions", item.Label)
			}
		}
	})

	t.Run("Scope-aware suggestions", func(t *testing.T) {
		setup()
		// Define a project DataSource in one file
		cfg1, _ := parser.NewParser("#package MYPROJ.Data\n+ProjectDS = { Class = FileReader +Signals = { S1 = { Type = int32 } } }").Parse()
		lsp.Tree.AddFile("project_ds.marte", cfg1)

		// Define an isolated file
		contentIso := "+MyGAM = { Class = IOGAM +InputSignals = { S1 = { DataSource = } } }"
		lsp.Documents["file://iso.marte"] = contentIso
		cfg2, _ := parser.NewParser(contentIso).Parse()
		lsp.Tree.AddFile("iso.marte", cfg2)

		lsp.Tree.ResolveReferences()

		// Completion in isolated file
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: "file://iso.marte"},
			Position:     lsp.Position{Line: 0, Character: strings.Index(contentIso, "DataSource = ") + len("DataSource = ") + 1},
		}

		list := lsp.HandleCompletion(params)
		foundProjectDS := false
		if list != nil {
			for _, item := range list.Items {
				if item.Label == "ProjectDS" {
					foundProjectDS = true
					break
				}
			}
		}
		if foundProjectDS {
			t.Error("Did not expect ProjectDS in isolated file suggestions")
		}

		// Completion in a project file
		lineContent := "+MyGAM = { Class = IOGAM +InputSignals = { S1 = { DataSource = Dummy } } }"
		contentPrj := "#package MYPROJ.App\n" + lineContent
		lsp.Documents["file://prj.marte"] = contentPrj
		pPrj := parser.NewParser(contentPrj)
		cfg3, err := pPrj.Parse()
		if err != nil {
			t.Logf("Parser error in contentPrj: %v", err)
		}
		lsp.Tree.AddFile("prj.marte", cfg3)
		lsp.Tree.ResolveReferences()

		paramsPrj := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: "file://prj.marte"},
			Position:     lsp.Position{Line: 1, Character: strings.Index(lineContent, "Dummy")},
		}

		listPrj := lsp.HandleCompletion(paramsPrj)
		foundProjectDS = false
		if listPrj != nil {
			for _, item := range listPrj.Items {
				if item.Label == "ProjectDS" {
					foundProjectDS = true
					break
				}
			}
		}
		if !foundProjectDS {
			t.Error("Expected ProjectDS in project file suggestions")
		}
	})

	t.Run("Suggest Signal Types", func(t *testing.T) {
		setup()
		content := `
+DS = {
    Class = FileReader
    Signals = {
        S1 = { Type = }
    }
}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 4, Character: strings.Index(content, "Type = ") + len("Type = ") + 1},
		}

		list := lsp.HandleCompletion(params)
		if list == nil {
			t.Fatal("Expected signal type suggestions")
		}

		foundUint32 := false
		for _, item := range list.Items {
			if item.Label == "uint32" {
				foundUint32 = true
				break
			}
		}
		if !foundUint32 {
			t.Error("Expected uint32 in suggestions")
		}
	})

	t.Run("Suggest CUE Enums", func(t *testing.T) {
		setup()
		// Inject custom schema with enum
		custom := []byte(`
package schema
#Classes: {
	TestEnumClass: {
		Mode: "Auto" | "Manual"
	}
}
`)
		val := lsp.GlobalSchema.Context.CompileBytes(custom)
		lsp.GlobalSchema.Value = lsp.GlobalSchema.Value.Unify(val)

		content := `
+Obj = {
    Class = TestEnumClass
    Mode =
}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 3, Character: strings.Index(content, "Mode = ") + len("Mode = ") + 1},
		}

		list := lsp.HandleCompletion(params)
		if list == nil {
			t.Fatal("Expected enum suggestions")
		}

		foundAuto := false
		for _, item := range list.Items {
			if item.Label == "\"Auto\"" { // CUE string value includes quotes
				foundAuto = true
				break
			}
		}
		if !foundAuto {
			// Check if it returned without quotes?
			// v.String() returns the quoted form for strings.
			t.Error("Expected \"Auto\" in suggestions")
			for _, item := range list.Items {
				t.Logf("Suggestion: %s", item.Label)
			}
		}
	})
}
@@ -3,8 +3,8 @@ package integration
import (
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
)

func TestLSPHoverDoc(t *testing.T) {
@@ -30,28 +30,28 @@ func TestLSPHoverDoc(t *testing.T) {
	file := "doc.marte"
	idx.AddFile(file, config)
	idx.ResolveReferences()

	// Test 1: Hover over +MyObject definition
	res := idx.Query(file, 4, 2) // Line 4: +MyObject
	if res == nil || res.Node == nil {
		t.Fatal("Query failed for definition")
	}

	expectedDoc := "Object Documentation\nSecond line"
	if res.Node.Doc != expectedDoc {
		t.Errorf("Expected definition doc:\n%q\nGot:\n%q", expectedDoc, res.Node.Doc)
	}

	// Test 2: Hover over MyObject reference
	resRef := idx.Query(file, 10, 16) // Line 10: RefField = MyObject
	if resRef == nil || resRef.Reference == nil {
		t.Fatal("Query failed for reference")
	}

	if resRef.Reference.Target == nil {
		t.Fatal("Reference target not resolved")
	}

	if resRef.Reference.Target.Doc != expectedDoc {
		t.Errorf("Expected reference target definition doc:\n%q\nGot:\n%q", expectedDoc, resRef.Reference.Target.Doc)
	}
test/lsp_hover_context_test.go (new file, 73 lines):
package integration

import (
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
)

func TestGetNodeContaining(t *testing.T) {
	content := `
+App = {
    Class = RealTimeApplication
    +State1 = {
        Class = RealTimeState
        +Thread1 = {
            Class = RealTimeThread
            Functions = { GAM1 }
        }
    }
}
+GAM1 = { Class = IOGAM }
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}

	idx := index.NewProjectTree()
	file := "hover_context.marte"
	idx.AddFile(file, config)
	idx.ResolveReferences()

	// Find reference to GAM1
	var gamRef *index.Reference
	for i := range idx.References {
		ref := &idx.References[i]
		if ref.Name == "GAM1" {
			gamRef = ref
			break
		}
	}

	if gamRef == nil {
		t.Fatal("Reference to GAM1 not found")
	}

	// Check containing node
	container := idx.GetNodeContaining(file, gamRef.Position)
	if container == nil {
		t.Fatal("Container not found")
	}

	if container.RealName != "+Thread1" {
		t.Errorf("Expected container +Thread1, got %s", container.RealName)
	}

	// Check traversal up to State
	curr := container
	foundState := false
	for curr != nil {
		if curr.RealName == "+State1" {
			foundState = true
			break
		}
		curr = curr.Parent
	}

	if !foundState {
		t.Error("State parent not found")
	}
}
test/lsp_server_test.go (new file, 199 lines):
package integration

import (
	"encoding/json"
	"os"
	"path/filepath"
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/lsp"
	"github.com/marte-community/marte-dev-tools/internal/parser"
)

func TestInitProjectScan(t *testing.T) {
	// 1. Setup temp dir with files
	tmpDir, err := os.MkdirTemp("", "lsp_test")
	if err != nil {
		t.Fatal(err)
	}
	defer os.RemoveAll(tmpDir)

	// File 1: Definition
	if err := os.WriteFile(filepath.Join(tmpDir, "def.marte"), []byte("#package Test.Common\n+Target = { Class = C }"), 0644); err != nil {
		t.Fatal(err)
	}
	// File 2: Reference
	if err := os.WriteFile(filepath.Join(tmpDir, "ref.marte"), []byte("#package Test.Common\n+Source = { Class = C Link = Target }"), 0644); err != nil {
		t.Fatal(err)
	}

	// 2. Initialize
	lsp.Tree = index.NewProjectTree() // Reset global tree

	initParams := lsp.InitializeParams{RootPath: tmpDir}
	paramsBytes, _ := json.Marshal(initParams)

	msg := &lsp.JsonRpcMessage{
		Method: "initialize",
		Params: paramsBytes,
		ID:     1,
	}

	lsp.HandleMessage(msg)

	// Query the reference in ref.marte at "Target"
	defParams := lsp.DefinitionParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + filepath.Join(tmpDir, "ref.marte")},
		Position:     lsp.Position{Line: 1, Character: 29},
	}

	res := lsp.HandleDefinition(defParams)
	if res == nil {
		t.Fatal("Definition not found via LSP after initialization")
	}

	locs, ok := res.([]lsp.Location)
	if !ok {
		t.Fatalf("Expected []lsp.Location, got %T", res)
	}

	if len(locs) == 0 {
		t.Fatal("No locations found")
	}

	// Verify uri points to def.marte
	expectedURI := "file://" + filepath.Join(tmpDir, "def.marte")
	if locs[0].URI != expectedURI {
		t.Errorf("Expected URI %s, got %s", expectedURI, locs[0].URI)
	}
}

func TestHandleDefinition(t *testing.T) {
	// Reset tree for test
	lsp.Tree = index.NewProjectTree()

	content := `
+MyObject = {
  Class = Type
}
+RefObject = {
  Class = Type
  RefField = MyObject
}
`
	path := "/test.marte"
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	lsp.Tree.AddFile(path, config)
	lsp.Tree.ResolveReferences()

	t.Logf("Refs: %d", len(lsp.Tree.References))
	for _, r := range lsp.Tree.References {
		t.Logf("  %s at %d:%d", r.Name, r.Position.Line, r.Position.Column)
	}

	// Test Go to Definition on MyObject reference
	params := lsp.DefinitionParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + path},
		Position:     lsp.Position{Line: 6, Character: 15}, // "MyObject" in RefField = MyObject
	}

	result := lsp.HandleDefinition(params)
	if result == nil {
		t.Fatal("HandleDefinition returned nil")
	}

	locations, ok := result.([]lsp.Location)
	if !ok {
		t.Fatalf("Expected []lsp.Location, got %T", result)
	}

	if len(locations) != 1 {
		t.Fatalf("Expected 1 location, got %d", len(locations))
	}

	if locations[0].Range.Start.Line != 1 { // +MyObject is on line 2 (0-indexed 1)
		t.Errorf("Expected definition on line 1, got %d", locations[0].Range.Start.Line)
	}
}

func TestHandleReferences(t *testing.T) {
	// Reset tree for test
	lsp.Tree = index.NewProjectTree()

	content := `
+MyObject = {
  Class = Type
}
+RefObject = {
  Class = Type
  RefField = MyObject
}
+AnotherRef = {
  Ref = MyObject
}
`
	path := "/test.marte"
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	lsp.Tree.AddFile(path, config)
	lsp.Tree.ResolveReferences()

	// Test Find References for MyObject (triggered from its definition)
	params := lsp.ReferenceParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + path},
		Position:     lsp.Position{Line: 1, Character: 1}, // "+MyObject"
		Context:      lsp.ReferenceContext{IncludeDeclaration: true},
	}

	locations := lsp.HandleReferences(params)
	if len(locations) != 3 { // 1 declaration + 2 references
		t.Fatalf("Expected 3 locations, got %d", len(locations))
	}
}

func TestLSPFormatting(t *testing.T) {
	// Setup
	content := `
#package Proj.Main
+Object={
Field=1
}
`
	uri := "file:///test.marte"

	// Open (populate Documents map)
	lsp.Documents[uri] = content

	// Format
	params := lsp.DocumentFormattingParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: uri},
	}

	edits := lsp.HandleFormatting(params)

	if len(edits) != 1 {
		t.Fatalf("Expected 1 edit, got %d", len(edits))
	}

	newText := edits[0].NewText

	expected := `#package Proj.Main

+Object = {
  Field = 1
}
`
	// Normalize newlines for comparison just in case
	if strings.TrimSpace(strings.ReplaceAll(newText, "\r\n", "\n")) != strings.TrimSpace(strings.ReplaceAll(expected, "\r\n", "\n")) {
		t.Errorf("Formatting mismatch.\nExpected:\n%s\nGot:\n%s", expected, newText)
	}
}
@@ -3,18 +3,32 @@ package integration
import (
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestLSPSignalReferences(t *testing.T) {
	content := `
+Data = {
  Class = ReferenceContainer
  +MyDS = {
    Class = FileReader
    Filename = "test"
    Signals = {
      MySig = { Type = uint32 }
    }
  }
}

+MyGAM = {
  Class = IOGAM
  InputSignals = {
    MySig = {
      DataSource = MyDS
      Type = uint32
    }
  }
}
`
	p := parser.NewParser(content)
@@ -24,26 +38,56 @@ func TestLSPSignalMetadata(t *testing.T) {
	}

	idx := index.NewProjectTree()
	idx.AddFile("signal_refs.marte", config)
	idx.ResolveReferences()

	v := validator.NewValidator(idx, ".")
	v.ValidateProject()

	// Find definition of MySig in MyDS
	root := idx.IsolatedFiles["signal_refs.marte"]
	if root == nil {
		t.Fatal("Root node not found")
	}

	// Traverse to MySig
	dataNode := root.Children["Data"]
	if dataNode == nil {
		t.Fatal("Data node not found")
	}

	myDS := dataNode.Children["MyDS"]
	if myDS == nil {
		t.Fatal("MyDS node not found")
	}

	signals := myDS.Children["Signals"]
	if signals == nil {
		t.Fatal("Signals node not found")
	}

	mySigDef := signals.Children["MySig"]
	if mySigDef == nil {
		t.Fatal("Definition of MySig not found in tree")
	}

	// Now simulate "Find References" on mySigDef
	foundRefs := 0
	idx.Walk(func(node *index.ProjectNode) {
		if node.Target == mySigDef {
			foundRefs++
			// Check if node is the GAM signal
			if node.RealName != "MySig" { // In GAM it is MySig
				t.Errorf("Unexpected reference node name: %s", node.RealName)
			}
			// Check parent is InputSignals -> MyGAM
			if node.Parent == nil || node.Parent.Parent == nil || node.Parent.Parent.RealName != "+MyGAM" {
				t.Errorf("Reference node not in MyGAM")
			}
		}
	})

	if foundRefs != 1 {
		t.Errorf("Expected 1 reference (Direct), found %d", foundRefs)
	}

	// Since handleHover logic is in internal/lsp, which we can't easily test directly without
	// exposing formatNodeInfo, we rely on the fact that Metadata is populated correctly.
	// If Metadata is correct, server.go logic (verified by code review) should display it.
}
@@ -4,9 +4,9 @@ import (
    "io/ioutil"
    "testing"

-   "github.com/marte-dev/marte-dev-tools/internal/index"
-   "github.com/marte-dev/marte-dev-tools/internal/parser"
-   "github.com/marte-dev/marte-dev-tools/internal/validator"
+   "github.com/marte-community/marte-dev-tools/internal/index"
+   "github.com/marte-community/marte-dev-tools/internal/parser"
+   "github.com/marte-community/marte-dev-tools/internal/validator"
)

// Helper to load and parse a file
@@ -26,14 +26,14 @@ func loadConfig(t *testing.T, filename string) *parser.Configuration {
func TestLSPDiagnostics(t *testing.T) {
    inputFile := "integration/check_dup.marte"
    config := loadConfig(t, inputFile)

    // Simulate LSP logic: Build Index -> Validate
    idx := index.NewProjectTree()
    idx.AddFile(inputFile, config)

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()

    // Check for expected diagnostics
    found := false
    for _, d := range v.Diagnostics {
@@ -51,7 +51,7 @@ func TestLSPDiagnostics(t *testing.T) {
    }

    // For GoToDefinition and References, we need to test the Indexer's ability to resolve symbols.
    // Currently, my Indexer (ProjectTree) stores structure but doesn't explicitly track
    // "references" in a way that maps a source position to a target symbol yet.
    // The ProjectTree is built for structure merging.
    // To support LSP "Go To Definition", we need to map usage -> definition.
@@ -63,7 +63,7 @@ func TestLSPDiagnostics(t *testing.T) {
    // Previously (before rewrite), `index.go` had `References []Reference`.
    // I removed it during the rewrite to ProjectTree!

    // I need to re-implement reference tracking in `ProjectTree` or a parallel structure
    // to support LSP features.
func TestLSPDefinition(t *testing.T) {
    // Create a virtual file content with a definition and a reference
@@ -94,15 +94,15 @@ func TestLSPDefinition(t *testing.T) {
            break
        }
    }

    if foundRef == nil {
        t.Fatal("Reference to MyObject not found in index")
    }

    if foundRef.Target == nil {
        t.Fatal("Reference to MyObject was not resolved to a target")
    }

    if foundRef.Target.RealName != "+MyObject" {
        t.Errorf("Expected target to be +MyObject, got %s", foundRef.Target.RealName)
    }
@@ -123,20 +123,33 @@ func TestLSPHover(t *testing.T) {
    idx := index.NewProjectTree()
    file := "hover.marte"
    idx.AddFile(file, config)

    // +MyObject is at line 2.
    // Query at line 2, col 2 (on 'M' of MyObject)
    res := idx.Query(file, 2, 2)

    if res == nil {
        t.Fatal("Query returned nil")
    }

    if res.Node == nil {
        t.Fatal("Expected Node result")
    }

    if res.Node.RealName != "+MyObject" {
        t.Errorf("Expected +MyObject, got %s", res.Node.RealName)
    }
}

+func TestParserError(t *testing.T) {
+   invalidContent := `
+A = {
+   Field =
+}
+`
+   p := parser.NewParser(invalidContent)
+   _, err := p.Parse()
+   if err == nil {
+       t.Fatal("Expected parser error, got nil")
+   }
+}
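The comments in the hunk above sketch the intended design: each usage site is recorded as a reference and later resolved to its defining node ("map usage -> definition"). A minimal, self-contained sketch of that idea — the `Node`, `Reference`, and `resolveReferences` names here are illustrative stand-ins, not the actual `internal/index` API:

```go
package main

import "fmt"

// Node is a stand-in for an indexed configuration node.
type Node struct {
	RealName string
}

// Reference records a usage site by name; Target is filled in by resolution.
type Reference struct {
	Name   string
	Target *Node
}

// resolveReferences maps each usage to its definition by name. Unresolvable
// references keep a nil Target, which a validator can then report.
func resolveReferences(defs map[string]*Node, refs []*Reference) {
	for _, r := range refs {
		if n, ok := defs[r.Name]; ok {
			r.Target = n
		}
	}
}

func main() {
	defs := map[string]*Node{"MyObject": {RealName: "+MyObject"}}
	refs := []*Reference{{Name: "MyObject"}, {Name: "Missing"}}
	resolveReferences(defs, refs)
	fmt.Println(refs[0].Target.RealName) // +MyObject
	fmt.Println(refs[1].Target == nil)   // true
}
```

Storing the resolved `Target` pointer on each reference is what lets both Go-To-Definition (reference to target) and Find-References (walk all nodes, compare targets) reuse the same data, as the tests above do.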
test/parser_strictness_test.go (new file, 35 lines)
@@ -0,0 +1,35 @@
package integration

import (
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/parser"
)

func TestParserStrictness(t *testing.T) {
    // Case 1: content not a definition (missing =)
    invalidDef := `
A = {
    Field = 10
    XXX
}
`
    p := parser.NewParser(invalidDef)
    _, err := p.Parse()
    if err == nil {
        t.Error("Expected error for invalid definition XXX, got nil")
    }

    // Case 2: Missing closing bracket
    missingBrace := `
A = {
    SUBNODE = {
        FIELD = 10
    }
`
    p2 := parser.NewParser(missingBrace)
    _, err2 := p2.Parse()
    if err2 == nil {
        t.Error("Expected error for missing closing bracket, got nil")
    }
}
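The two strictness cases above (a stray token, a missing closing brace) lend themselves to Go's table-driven style as more cases accumulate. A toy brace-balance checker stands in for the real parser here; `checkBraces` is hypothetical, not part of the repository:

```go
package main

import "fmt"

// checkBraces is a toy stand-in for the parser's strictness checks:
// it only verifies that '{' and '}' are balanced.
func checkBraces(src string) error {
	depth := 0
	for _, r := range src {
		switch r {
		case '{':
			depth++
		case '}':
			depth--
			if depth < 0 {
				return fmt.Errorf("unexpected '}'")
			}
		}
	}
	if depth != 0 {
		return fmt.Errorf("missing %d closing brace(s)", depth)
	}
	return nil
}

func main() {
	// Table-driven layout: each case names itself and states the expectation.
	cases := []struct {
		name string
		src  string
		ok   bool
	}{
		{"balanced", "A = { B = 1 }", true},
		{"missing close", "A = { SUB = { X = 1 }", false},
	}
	for _, c := range cases {
		err := checkBraces(c.src)
		fmt.Println(c.name, (err == nil) == c.ok)
	}
}
```

In a real `_test.go` file the loop body would call `t.Run(c.name, ...)` so each fixture reports failures independently.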
@@ -1,7 +1,9 @@
-package parser
+package integration

import (
    "testing"
+
+   "github.com/marte-community/marte-dev-tools/internal/parser"
)

func TestParseBasic(t *testing.T) {
@@ -22,7 +24,7 @@ $Node2 = {
    Array = {1 2 3}
}
`
-   p := NewParser(input)
+   p := parser.NewParser(input)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse error: %v", err)
@@ -4,9 +4,9 @@ import (
    "strings"
    "testing"

-   "github.com/marte-dev/marte-dev-tools/internal/index"
-   "github.com/marte-dev/marte-dev-tools/internal/parser"
-   "github.com/marte-dev/marte-dev-tools/internal/validator"
+   "github.com/marte-community/marte-dev-tools/internal/index"
+   "github.com/marte-community/marte-dev-tools/internal/parser"
+   "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestMDSWriterValidation(t *testing.T) {
@@ -38,7 +38,7 @@ func TestMDSWriterValidation(t *testing.T) {

    found := false
    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'TreeName'") {
+       if strings.Contains(d.Message, "TreeName: incomplete value") {
            found = true
            break
        }
@@ -71,7 +71,7 @@ func TestMathExpressionGAMValidation(t *testing.T) {

    found := false
    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'Expression'") {
+       if strings.Contains(d.Message, "Expression: incomplete value") {
            found = true
            break
        }
@@ -4,9 +4,9 @@ import (
    "strings"
    "testing"

-   "github.com/marte-dev/marte-dev-tools/internal/index"
-   "github.com/marte-dev/marte-dev-tools/internal/parser"
-   "github.com/marte-dev/marte-dev-tools/internal/validator"
+   "github.com/marte-community/marte-dev-tools/internal/index"
+   "github.com/marte-community/marte-dev-tools/internal/parser"
+   "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestPIDGAMValidation(t *testing.T) {
@@ -35,10 +35,10 @@ func TestPIDGAMValidation(t *testing.T) {
    foundKd := false

    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'Ki'") {
+       if strings.Contains(d.Message, "Ki: incomplete value") {
            foundKi = true
        }
-       if strings.Contains(d.Message, "Missing mandatory field 'Kd'") {
+       if strings.Contains(d.Message, "Kd: incomplete value") {
            foundKd = true
        }
    }
@@ -73,7 +73,7 @@ func TestFileDataSourceValidation(t *testing.T) {

    found := false
    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'Filename'") {
+       if strings.Contains(d.Message, "Filename: incomplete value") {
            found = true
            break
        }
@@ -4,9 +4,9 @@ import (
    "strings"
    "testing"

-   "github.com/marte-dev/marte-dev-tools/internal/index"
-   "github.com/marte-dev/marte-dev-tools/internal/parser"
-   "github.com/marte-dev/marte-dev-tools/internal/validator"
+   "github.com/marte-community/marte-dev-tools/internal/index"
+   "github.com/marte-community/marte-dev-tools/internal/parser"
+   "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestRealTimeApplicationValidation(t *testing.T) {
@@ -35,14 +35,20 @@ func TestRealTimeApplicationValidation(t *testing.T) {
    missingStates := false

    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'Data'") {
+       if strings.Contains(d.Message, "Data: field is required") {
            missingData = true
        }
-       if strings.Contains(d.Message, "Missing mandatory field 'States'") {
+       if strings.Contains(d.Message, "States: field is required") {
            missingStates = true
        }
    }

+   if !missingData || !missingStates {
+       for _, d := range v.Diagnostics {
+           t.Logf("Diagnostic: %s", d.Message)
+       }
+   }
+
    if !missingData {
        t.Error("Expected error for missing 'Data' field in RealTimeApplication")
    }
@@ -73,7 +79,7 @@ func TestGAMSchedulerValidation(t *testing.T) {

    found := false
    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'TimingDataSource'") {
+       if strings.Contains(d.Message, "TimingDataSource: incomplete value") {
            found = true
            break
        }
@@ -4,9 +4,9 @@ import (
    "strings"
    "testing"

-   "github.com/marte-dev/marte-dev-tools/internal/index"
-   "github.com/marte-dev/marte-dev-tools/internal/parser"
-   "github.com/marte-dev/marte-dev-tools/internal/validator"
+   "github.com/marte-community/marte-dev-tools/internal/index"
+   "github.com/marte-community/marte-dev-tools/internal/parser"
+   "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSDNSubscriberValidation(t *testing.T) {
@@ -32,7 +32,7 @@ func TestSDNSubscriberValidation(t *testing.T) {

    found := false
    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'Port'") {
+       if strings.Contains(d.Message, "Port: incomplete value") {
            found = true
            break
        }
@@ -65,7 +65,7 @@ func TestFileWriterValidation(t *testing.T) {

    found := false
    for _, d := range v.Diagnostics {
-       if strings.Contains(d.Message, "Missing mandatory field 'Filename'") {
+       if strings.Contains(d.Message, "Filename: incomplete value") {
            found = true
            break
        }
test/validator_functions_array_test.go (new file, 74 lines)
@@ -0,0 +1,74 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestFunctionsArrayValidation(t *testing.T) {
    content := `
+App = {
    Class = RealTimeApplication
    +State = {
        Class = RealTimeState
        +Thread = {
            Class = RealTimeThread
            Functions = {
                ValidGAM,
                InvalidGAM, // Not a GAM (DataSource)
                MissingGAM, // Not found
                "String", // Not reference
            }
        }
    }
}

+ValidGAM = { Class = IOGAM InputSignals = {} }
+InvalidGAM = { Class = FileReader }
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("funcs.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()

    foundInvalid := false
    foundMissing := false
    foundNotRef := false

    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "not found or is not a valid GAM") {
            // This covers both InvalidGAM and MissingGAM cases
            if strings.Contains(d.Message, "InvalidGAM") {
                foundInvalid = true
            }
            if strings.Contains(d.Message, "MissingGAM") {
                foundMissing = true
            }
        }
        if strings.Contains(d.Message, "must contain references") {
            foundNotRef = true
        }
    }

    if !foundInvalid {
        t.Error("Expected error for InvalidGAM")
    }
    if !foundMissing {
        t.Error("Expected error for MissingGAM")
    }
    if !foundNotRef {
        t.Error("Expected error for non-reference element")
    }
}
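The validator tests in this changeset all repeat the same inline pattern: scan `v.Diagnostics` for a message containing one or more substrings. A small helper (hypothetical — not part of the repository, just an extraction of the repeated pattern) would collapse those loops:

```go
package main

import (
	"fmt"
	"strings"
)

// Diagnostic mirrors the shape the tests rely on: only the message
// text matters for matching.
type Diagnostic struct {
	Message string
}

// hasDiagnostic reports whether any diagnostic message contains all of
// the given substrings, the check the tests above write out inline.
func hasDiagnostic(diags []Diagnostic, substrs ...string) bool {
	for _, d := range diags {
		all := true
		for _, s := range substrs {
			if !strings.Contains(d.Message, s) {
				all = false
				break
			}
		}
		if all {
			return true
		}
	}
	return false
}

func main() {
	diags := []Diagnostic{
		{Message: "'InvalidGAM' not found or is not a valid GAM"},
		{Message: "Functions must contain references"},
	}
	fmt.Println(hasDiagnostic(diags, "not found or is not a valid GAM", "InvalidGAM")) // true
	fmt.Println(hasDiagnostic(diags, "MissingGAM"))                                    // false
}
```

Requiring all substrings in one message (rather than across messages) keeps checks like "this error and it names InvalidGAM" from matching two unrelated diagnostics.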
test/validator_gam_direction_test.go (new file, 85 lines)
@@ -0,0 +1,85 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestGAMSignalDirectionality(t *testing.T) {
    content := `
$App = {
    $Data = {
        +InDS = { Class = FileReader Filename="f" +Signals = { S1 = { Type = uint32 } } }
        +OutDS = { Class = FileWriter Filename="f" +Signals = { S1 = { Type = uint32 } } }
        +InOutDS = { Class = FileDataSource Filename="f" +Signals = { S1 = { Type = uint32 } } }
    }
    +ValidGAM = {
        Class = IOGAM
        InputSignals = {
            S1 = { DataSource = InDS }
            S2 = { DataSource = InOutDS Alias = S1 }
        }
        OutputSignals = {
            S3 = { DataSource = OutDS Alias = S1 }
            S4 = { DataSource = InOutDS Alias = S1 }
        }
    }
    +InvalidGAM = {
        Class = IOGAM
        InputSignals = {
            BadIn = { DataSource = OutDS Alias = S1 }
        }
        OutputSignals = {
            BadOut = { DataSource = InDS Alias = S1 }
        }
    }
}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("dir.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()

    // Check ValidGAM has NO directionality errors
    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "is Output-only but referenced in InputSignals") ||
            strings.Contains(d.Message, "is Input-only but referenced in OutputSignals") {
            if strings.Contains(d.Message, "ValidGAM") {
                t.Errorf("Unexpected direction error for ValidGAM: %s", d.Message)
            }
        }
    }

    // Check InvalidGAM HAS errors
    foundBadIn := false
    foundBadOut := false
    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "InvalidGAM") {
            if strings.Contains(d.Message, "is Output-only but referenced in InputSignals") {
                foundBadIn = true
            }
            if strings.Contains(d.Message, "is Input-only but referenced in OutputSignals") {
                foundBadOut = true
            }
        }
    }

    if !foundBadIn {
        t.Error("Expected error for OutDS in InputSignals of InvalidGAM")
    }
    if !foundBadOut {
        t.Error("Expected error for InDS in OutputSignals of InvalidGAM")
    }
}
test/validator_gam_signals_linking_test.go (new file, 81 lines)
@@ -0,0 +1,81 @@
package integration

import (
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestGAMSignalLinking(t *testing.T) {
    content := `
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test.txt"
        Signals = {
            MySig = { Type = uint32 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        MySig = {
            DataSource = MyDS
            Type = uint32
        }
        AliasedSig = {
            Alias = MySig
            DataSource = MyDS
            Type = uint32
        }
    }
}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("gam_signals_linking.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()

    if len(v.Diagnostics) > 0 {
        for _, d := range v.Diagnostics {
            t.Logf("Diagnostic: %s", d.Message)
        }
        t.Fatalf("Validation failed with %d issues", len(v.Diagnostics))
    }

    foundMyDSRef := 0
    foundAliasRef := 0

    for _, ref := range idx.References {
        if ref.Name == "MyDS" {
            if ref.Target != nil && ref.Target.RealName == "+MyDS" {
                foundMyDSRef++
            }
        }
        if ref.Name == "MySig" {
            if ref.Target != nil && ref.Target.RealName == "MySig" {
                foundAliasRef++
            }
        }
    }

    if foundMyDSRef < 2 {
        t.Errorf("Expected at least 2 resolved MyDS references, found %d", foundMyDSRef)
    }
    if foundAliasRef < 1 {
        t.Errorf("Expected at least 1 resolved Alias MySig reference, found %d", foundAliasRef)
    }
}
test/validator_gam_signals_test.go (new file, 108 lines)
@@ -0,0 +1,108 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestGAMSignalValidation(t *testing.T) {
    content := `
+Data = {
    Class = ReferenceContainer
    +InDS = {
        Class = FileReader
        Signals = {
            SigIn = { Type = uint32 }
        }
    }
    +OutDS = {
        Class = FileWriter
        Signals = {
            SigOut = { Type = uint32 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        SigIn = {
            DataSource = InDS
            Type = uint32
        }
        // Error: OutDS is OUT only
        BadInput = {
            DataSource = OutDS
            Alias = SigOut
            Type = uint32
        }
        // Error: MissingSig not in InDS
        Missing = {
            DataSource = InDS
            Alias = MissingSig
            Type = uint32
        }
    }
    OutputSignals = {
        SigOut = {
            DataSource = OutDS
            Type = uint32
        }
        // Error: InDS is IN only
        BadOutput = {
            DataSource = InDS
            Alias = SigIn
            Type = uint32
        }
    }
}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("gam_signals.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()

    foundBadInput := false
    foundMissing := false
    foundBadOutput := false

    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "DataSource 'OutDS' (Class FileWriter) is Output-only but referenced in InputSignals") {
            foundBadInput = true
        }
        if strings.Contains(d.Message, "Implicitly Defined Signal: 'MissingSig'") {
            foundMissing = true
        }
        if strings.Contains(d.Message, "DataSource 'InDS' (Class FileReader) is Input-only but referenced in OutputSignals") {
            foundBadOutput = true
        }
    }

    if !foundBadInput || !foundMissing || !foundBadOutput {
        for _, d := range v.Diagnostics {
            t.Logf("Diagnostic: %s", d.Message)
        }
    }

    if !foundBadInput {
        t.Error("Expected error for OutDS in InputSignals")
    }
    if !foundMissing {
        t.Error("Expected error for missing signal reference")
    }
    if !foundBadOutput {
        t.Error("Expected error for InDS in OutputSignals")
    }
}
test/validator_global_pragma_debug_test.go (new file, 65 lines)
@@ -0,0 +1,65 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestGlobalPragmaDebug(t *testing.T) {
    content := `//! allow(implicit): Debugging
//! allow(unused): Debugging
+Data={Class=ReferenceContainer}
+GAM={Class=IOGAM InputSignals={Impl={DataSource=Data Type=uint32}}}
+UnusedGAM={Class=IOGAM}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    // Check if pragma parsed
    if len(config.Pragmas) == 0 {
        t.Fatal("Pragma not parsed")
    }
    t.Logf("Parsed Pragma 0: %s", config.Pragmas[0].Text)

    idx := index.NewProjectTree()
    idx.AddFile("debug.marte", config)
    idx.ResolveReferences()

    // Check if added to GlobalPragmas
    pragmas, ok := idx.GlobalPragmas["debug.marte"]
    if !ok || len(pragmas) == 0 {
        t.Fatal("GlobalPragmas not populated")
    }
    t.Logf("Global Pragma stored: %s", pragmas[0])

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()
    v.CheckUnused() // Must call this for unused check!

    foundImplicitWarning := false
    foundUnusedWarning := false
    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "Implicitly Defined Signal") {
            foundImplicitWarning = true
            t.Logf("Found warning: %s", d.Message)
        }
        if strings.Contains(d.Message, "Unused GAM") {
            foundUnusedWarning = true
            t.Logf("Found warning: %s", d.Message)
        }
    }

    if foundImplicitWarning {
        t.Error("Expected implicit warning to be suppressed")
    }
    if foundUnusedWarning {
        t.Error("Expected unused warning to be suppressed")
    }
}
test/validator_global_pragma_test.go (new file, 67 lines)
@@ -0,0 +1,67 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestGlobalPragma(t *testing.T) {
    content := `
//!allow(unused): Suppress all unused
//!allow(implicit): Suppress all implicit

+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            UnusedSig = { Type = uint32 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        ImplicitSig = { DataSource = MyDS Type = uint32 }
    }
}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("global_pragma.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()
    v.CheckUnused()

    foundUnusedWarning := false
    foundImplicitWarning := false

    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "Unused Signal") {
            foundUnusedWarning = true
        }
        if strings.Contains(d.Message, "Implicitly Defined Signal") {
            foundImplicitWarning = true
        }
    }

    if foundUnusedWarning {
        t.Error("Expected warning for UnusedSig to be suppressed globally")
    }
    if foundImplicitWarning {
        t.Error("Expected warning for ImplicitSig to be suppressed globally")
    }
}
test/validator_global_pragma_update_test.go (new file, 75 lines)
@@ -0,0 +1,75 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestGlobalPragmaUpdate(t *testing.T) {
    // Scenario: Project scope. File A has pragma. File B has warning.

    fileA := "fileA.marte"
    contentA_WithPragma := `
#package my.project
//!allow(unused): Suppress
`
    contentA_NoPragma := `
#package my.project
// No pragma
`

    fileB := "fileB.marte"
    contentB := `
#package my.project
+Data={Class=ReferenceContainer +DS={Class=FileReader Filename="t" Signals={Unused={Type=uint32}}}}
`

    idx := index.NewProjectTree()

    // Helper to validate
    check := func() bool {
        idx.ResolveReferences()
        v := validator.NewValidator(idx, ".")
        v.ValidateProject()
        v.CheckUnused()
        for _, d := range v.Diagnostics {
            if strings.Contains(d.Message, "Unused Signal") {
                return true // Found warning
            }
        }
        return false
    }

    // 1. Add A (with pragma) and B
    pA := parser.NewParser(contentA_WithPragma)
    cA, _ := pA.Parse()
    idx.AddFile(fileA, cA)

    pB := parser.NewParser(contentB)
    cB, _ := pB.Parse()
    idx.AddFile(fileB, cB)

    if check() {
        t.Error("Step 1: Expected warning to be suppressed")
    }

    // 2. Update A (remove pragma)
    pA2 := parser.NewParser(contentA_NoPragma)
    cA2, _ := pA2.Parse()
    idx.AddFile(fileA, cA2)

    if !check() {
        t.Error("Step 2: Expected warning to appear")
    }

    // 3. Update A (add pragma back)
    idx.AddFile(fileA, cA) // Re-use config A

    if check() {
        t.Error("Step 3: Expected warning to be suppressed again")
    }
}
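The update scenario above hinges on one property: package-scoped suppressions must be recomputed from the current set of files on every change, never cached from the file that first declared them. A minimal model of that invariant (the `fileInfo` and `suppressed` names are illustrative, not the real `index.ProjectTree` API):

```go
package main

import "fmt"

// fileInfo models what the index keeps per file: its package and any
// allow() pragmas declared in it.
type fileInfo struct {
	pkg     string
	pragmas []string
}

// suppressed re-scans all files of a package on each call, so updating
// or re-adding a file (as TestGlobalPragmaUpdate does) takes effect
// immediately without any cache invalidation step.
func suppressed(files map[string]fileInfo, pkg, warning string) bool {
	for _, f := range files {
		if f.pkg != pkg {
			continue
		}
		for _, p := range f.pragmas {
			if p == warning {
				return true
			}
		}
	}
	return false
}

func main() {
	files := map[string]fileInfo{
		"fileA.marte": {pkg: "my.project", pragmas: []string{"unused"}},
		"fileB.marte": {pkg: "my.project"},
	}
	fmt.Println(suppressed(files, "my.project", "unused")) // true

	// Updating fileA to drop the pragma re-enables the warning.
	files["fileA.marte"] = fileInfo{pkg: "my.project"}
	fmt.Println(suppressed(files, "my.project", "unused")) // false
}
```

Rescanning is O(files) per query; for an LSP serving many diagnostics, a per-package set rebuilt on each `AddFile` would give the same behaviour with cheaper lookups.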
test/validator_ignore_pragma_test.go (new file, 59 lines)
@@ -0,0 +1,59 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestIgnorePragma(t *testing.T) {
    content := `
//!ignore(unused): Suppress global unused
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            Unused1 = { Type = uint32 }

            //!ignore(unused): Suppress local unused
            Unused2 = { Type = uint32 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        //!ignore(implicit): Suppress local implicit
        ImplicitSig = { DataSource = MyDS Type = uint32 }
    }
}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("ignore.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()
    v.CheckUnused()

    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "Unused Signal") {
            t.Errorf("Unexpected warning: %s", d.Message)
        }
        if strings.Contains(d.Message, "Implicitly Defined Signal") {
            t.Errorf("Unexpected warning: %s", d.Message)
        }
    }
}
test/validator_implicit_signal_test.go (new file, 107 lines)
@@ -0,0 +1,107 @@
package integration

import (
    "strings"
    "testing"

    "github.com/marte-community/marte-dev-tools/internal/index"
    "github.com/marte-community/marte-dev-tools/internal/parser"
    "github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestImplicitSignal(t *testing.T) {
    content := `
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            ExplicitSig = { Type = uint32 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        ExplicitSig = {
            DataSource = MyDS
            Type = uint32
        }
        ImplicitSig = {
            DataSource = MyDS
            Type = uint32
        }
    }
}
`
    p := parser.NewParser(content)
    config, err := p.Parse()
    if err != nil {
        t.Fatalf("Parse failed: %v", err)
    }

    idx := index.NewProjectTree()
    idx.AddFile("implicit_signal.marte", config)
    idx.ResolveReferences()

    v := validator.NewValidator(idx, ".")
    v.ValidateProject()

    foundWarning := false
    foundError := false

    for _, d := range v.Diagnostics {
        if strings.Contains(d.Message, "Implicitly Defined Signal") {
            if strings.Contains(d.Message, "ImplicitSig") {
                foundWarning = true
            }
        }
        if strings.Contains(d.Message, "Signal 'ExplicitSig' not found") {
            foundError = true
        }
    }

    if !foundWarning || foundError {
        for _, d := range v.Diagnostics {
            t.Logf("Diagnostic: %s", d.Message)
        }
    }

    if !foundWarning {
        t.Error("Expected warning for ImplicitSig")
    }
    if foundError {
        t.Error("Unexpected error for ExplicitSig")
    }

    // Test missing Type for implicit
    contentMissingType := `
+Data = { Class = ReferenceContainer +DS={Class=FileReader Filename="" Signals={}} }
+GAM = { Class = IOGAM InputSignals = { Impl = { DataSource = DS } } }
`
    p2 := parser.NewParser(contentMissingType)
    config2, err2 := p2.Parse()
    if err2 != nil {
        t.Fatalf("Parse2 failed: %v", err2)
    }
    idx2 := index.NewProjectTree()
    idx2.AddFile("missing_type.marte", config2)
    idx2.ResolveReferences()
    v2 := validator.NewValidator(idx2, ".")
    v2.ValidateProject()

    foundTypeErr := false
    for _, d := range v2.Diagnostics {
        if strings.Contains(d.Message, "Implicit signal 'Impl' must define Type") {
            foundTypeErr = true
        }
    }
    if !foundTypeErr {
        for _, d := range v2.Diagnostics {
            t.Logf("Diagnostic2: %s", d.Message)
        }
        t.Error("Expected error for missing Type in implicit signal")
    }
}
@@ -5,9 +5,9 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func parseAndAddToIndex(t *testing.T, idx *index.ProjectTree, filePath string) {
@@ -32,18 +32,18 @@ func TestMultiFileNodeValidation(t *testing.T) {

-	// Resolving references might be needed if the validator relies on it for merging implicitly
-	// But primarily we want to check if the validator sees the merged node.
 	// The current implementation of Validator likely iterates over the ProjectTree.
 	// If the ProjectTree doesn't merge nodes automatically, the Validator needs to do it.
 	// However, the spec says "The build tool, validator, and LSP must merge these definitions".
 	// Let's assume the Validator or Index does the merging logic.

 	v := validator.NewValidator(idx, ".")
 	v.ValidateProject()

 	// +MyNode is split.
 	// valid_1 has FieldA
 	// valid_2 has Class and FieldB
 	// If merging works, it should have a Class, so no error about missing Class.

 	for _, diag := range v.Diagnostics {
 		if strings.Contains(diag.Message, "must contain a 'Class' field") {
@@ -79,14 +79,14 @@ func TestMultiFileReference(t *testing.T) {
 	parseAndAddToIndex(t, idx, "integration/multifile_ref_2.marte")

 	idx.ResolveReferences()

 	// Check if the reference in +SourceNode to TargetNode is resolved.
 	v := validator.NewValidator(idx, ".")
 	v.ValidateProject()

 	if len(v.Diagnostics) > 0 {
 		// Filter out irrelevant errors
 	}
 }
 func TestHierarchicalPackageMerge(t *testing.T) {
@@ -99,13 +99,13 @@ func TestHierarchicalPackageMerge(t *testing.T) {

 	// +MyObj should have Class (from file 1) and FieldX (from file 2).
 	// If Class is missing, ValidateProject reports error.

 	for _, diag := range v.Diagnostics {
 		if strings.Contains(diag.Message, "must contain a 'Class' field") {
 			t.Errorf("Unexpected 'Class' field error for +MyObj: %s", diag.Message)
 		}
 	}

 	// We can also inspect the tree to verify FieldX is there (optional, but good for confidence)
 	baseNode := idx.Root.Children["Base"]
 	if baseNode == nil {
@@ -115,7 +115,7 @@ func TestHierarchicalPackageMerge(t *testing.T) {
 	if objNode == nil {
 		t.Fatal("MyObj node not found in Base")
 	}

 	hasFieldX := false
 	for _, frag := range objNode.Fragments {
 		for _, def := range frag.Definitions {
@@ -124,7 +124,7 @@ func TestHierarchicalPackageMerge(t *testing.T) {
 			}
 		}
 	}

 	if !hasFieldX {
 		t.Error("FieldX not found in +MyObj")
 	}
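The comments in TestMultiFileNodeValidation assume the index or validator merges a node's fragments from several files before checking for Class. A minimal, self-contained sketch of that merge rule — `fragment` and `mergedHasClass` are hypothetical illustrations, not the project's actual types:

```go
package main

import "fmt"

// fragment is a hypothetical stand-in for one file's contribution to a
// split node: just its field assignments.
type fragment map[string]string

// mergedHasClass sketches the merge rule described in the test comments:
// a node defined across several files is complete as long as any one
// fragment supplies the Class field.
func mergedHasClass(frags []fragment) bool {
	for _, f := range frags {
		if _, ok := f["Class"]; ok {
			return true
		}
	}
	return false
}

func main() {
	// +MyNode split: valid_1 contributes FieldA, valid_2 contributes Class and FieldB.
	frags := []fragment{
		{"FieldA": "1"},
		{"Class": "ReferenceContainer", "FieldB": "2"},
	}
	fmt.Println(mergedHasClass(frags)) // true
}
```

Under this rule, the test above expects no "must contain a 'Class' field" diagnostic for +MyNode, since valid_2's fragment supplies Class.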
@@ -153,44 +153,44 @@ func TestHierarchicalDuplicate(t *testing.T) {

 func TestIsolatedFileValidation(t *testing.T) {
 	idx := index.NewProjectTree()

 	// File 1: Has package. Defines SharedClass.
 	f1Content := `
#package Proj.Pkg
+SharedObj = { Class = SharedClass }
`
 	p1 := parser.NewParser(f1Content)
 	c1, _ := p1.Parse()
 	idx.AddFile("shared.marte", c1)

 	// File 2: No package. References SharedObj.
 	// Should NOT resolve to SharedObj in shared.marte because iso.marte is isolated.
 	f2Content := `
+IsoObj = {
    Class = "MyClass"
    Ref = SharedObj
}
`
 	p2 := parser.NewParser(f2Content)
 	c2, _ := p2.Parse()
 	idx.AddFile("iso.marte", c2)

 	idx.ResolveReferences()

 	// Find reference
 	var ref *index.Reference
 	for i := range idx.References {
 		if idx.References[i].File == "iso.marte" && idx.References[i].Name == "SharedObj" {
 			ref = &idx.References[i]
 			break
 		}
 	}

 	if ref == nil {
 		t.Fatal("Reference SharedObj not found in index")
 	}

 	if ref.Target != nil {
 		t.Errorf("Expected reference in isolated file to be unresolved, but got target in %s", ref.Target.Fragments[0].File)
 	}
 }
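The isolation rule the test above relies on — a file without a #package directive only resolves references within itself — can be sketched as a tiny lookup. The `definition` type and `resolve` function are hypothetical illustrations of the rule, not the project's actual ProjectTree API:

```go
package main

import "fmt"

// definition is a hypothetical stand-in for an indexed node: where it was
// defined and which #package (if any) its file declared.
type definition struct {
	file, pkg, name string
}

// resolve sketches the isolation rule TestIsolatedFileValidation checks:
// a reference from a file with no #package directive only sees definitions
// in that same file, so SharedObj in shared.marte stays out of reach.
func resolve(defs []definition, refFile, refPkg, name string) *definition {
	for i := range defs {
		d := &defs[i]
		if d.name != name {
			continue
		}
		sameFile := d.file == refFile
		if refPkg == "" { // isolated file: same-file lookups only
			if sameFile {
				return d
			}
			continue
		}
		if sameFile || d.pkg == refPkg {
			return d
		}
	}
	return nil
}

func main() {
	defs := []definition{{file: "shared.marte", pkg: "Proj.Pkg", name: "SharedObj"}}
	// iso.marte has no package, so the lookup must fail.
	fmt.Println(resolve(defs, "iso.marte", "", "SharedObj") == nil) // true
}
```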
69	test/validator_pragma_test.go	Normal file
@@ -0,0 +1,69 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestPragmaSuppression(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            //!unused: Ignore this
            UnusedSig = { Type = uint32 }
            UsedSig = { Type = uint32 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        UsedSig = { DataSource = MyDS Type = uint32 }

        //!implicit: Ignore this implicit
        ImplicitSig = { DataSource = MyDS Type = uint32 }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}

	idx := index.NewProjectTree()
	idx.AddFile("pragma.marte", config)
	idx.ResolveReferences()

	v := validator.NewValidator(idx, ".")
	v.ValidateProject()
	v.CheckUnused()

	foundUnusedWarning := false
	foundImplicitWarning := false

	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "Unused Signal") && strings.Contains(d.Message, "UnusedSig") {
			foundUnusedWarning = true
		}
		if strings.Contains(d.Message, "Implicitly Defined Signal") && strings.Contains(d.Message, "ImplicitSig") {
			foundImplicitWarning = true
		}
	}

	if foundUnusedWarning {
		t.Error("Expected warning for UnusedSig to be suppressed")
	}
	if foundImplicitWarning {
		t.Error("Expected warning for ImplicitSig to be suppressed")
	}
}
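The suppression comments above follow a `//!kind: message` shape. A minimal sketch of parsing that syntax — `parsePragma` is a hypothetical helper illustrating the format, and the project's real parser may implement it differently:

```go
package main

import (
	"fmt"
	"strings"
)

// parsePragma splits a suppression comment like "//!unused: Ignore this"
// into its kind ("unused") and message ("Ignore this"). Ordinary comments
// return ok=false.
func parsePragma(line string) (kind, msg string, ok bool) {
	line = strings.TrimSpace(line)
	rest, found := strings.CutPrefix(line, "//!")
	if !found {
		return "", "", false
	}
	k, m, found := strings.Cut(rest, ":")
	if !found {
		return "", "", false
	}
	return strings.TrimSpace(k), strings.TrimSpace(m), true
}

func main() {
	k, m, ok := parsePragma("//!unused: Ignore this")
	fmt.Println(k, m, ok) // unused Ignore this true
}
```

A validator could then drop a diagnostic when the pragma kind attached to the preceding line matches the diagnostic's category, which is the behavior TestPragmaSuppression asserts.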
@@ -6,9 +6,9 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestProjectSpecificSchema(t *testing.T) {
@@ -21,17 +21,16 @@ func TestProjectSpecificSchema(t *testing.T) {

 	// Define project schema
 	schemaContent := `
-{
-  "classes": {
-    "ProjectClass": {
-      "fields": [
-        {"name": "CustomField", "type": "int", "mandatory": true}
-      ]
-    }
-  }
-}
+package schema
+
+#Classes: {
+	ProjectClass: {
+		CustomField: int
+		...
+	}
+}
 `
-	err = os.WriteFile(filepath.Join(tmpDir, ".marte_schema.json"), []byte(schemaContent), 0644)
+	err = os.WriteFile(filepath.Join(tmpDir, ".marte_schema.cue"), []byte(schemaContent), 0644)
 	if err != nil {
 		t.Fatal(err)
 	}
@@ -59,7 +58,7 @@ func TestProjectSpecificSchema(t *testing.T) {

 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'CustomField'") {
+		if strings.Contains(d.Message, "CustomField: incomplete value") {
 			found = true
 			break
 		}
@@ -4,44 +4,11 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

-func TestSchemaValidationMandatory(t *testing.T) {
-	// StateMachine requires "States"
-	content := `
-+MySM = {
-    Class = StateMachine
-    // Missing States
-}
-`
-	p := parser.NewParser(content)
-	config, err := p.Parse()
-	if err != nil {
-		t.Fatalf("Parse failed: %v", err)
-	}
-
-	idx := index.NewProjectTree()
-	idx.AddFile("test.marte", config)
-
-	v := validator.NewValidator(idx, ".")
-	v.ValidateProject()
-
-	found := false
-	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'States'") {
-			found = true
-			break
-		}
-	}
-
-	if !found {
-		t.Error("Expected error for missing mandatory field 'States', but found none")
-	}
-}
-
 func TestSchemaValidationType(t *testing.T) {
 	// OrderedClass: First (int), Second (string)
 	content := `
@@ -65,7 +32,7 @@ func TestSchemaValidationType(t *testing.T) {

 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Field 'First' expects type 'int'") {
+		if strings.Contains(d.Message, "mismatched types") {
 			found = true
 			break
 		}
@@ -105,8 +72,8 @@ func TestSchemaValidationOrder(t *testing.T) {
 		}
 	}

-	if !found {
-		t.Error("Expected error for out-of-order fields, but found none")
+	if found {
+		t.Error("Unexpected error for out-of-order fields (Order check is disabled in CUE)")
 	}
 }
108	test/validator_signal_properties_test.go	Normal file
@@ -0,0 +1,108 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSignalProperties(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            Correct = { Type = uint32 NumberOfElements = 10 }
        }
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        // Correct reference
        Correct = { DataSource = MyDS Type = uint32 NumberOfElements = 10 }

        // Mismatch Type
        BadType = {
            Alias = Correct
            DataSource = MyDS
            Type = float32 // Error
        }

        // Mismatch Elements
        BadElements = {
            Alias = Correct
            DataSource = MyDS
            Type = uint32
            NumberOfElements = 20 // Error
        }

        // Valid Cast
        //!cast(uint32, float32): Cast reason
        CastSig = {
            Alias = Correct
            DataSource = MyDS
            Type = float32 // OK
        }

        // Invalid Cast (Wrong definition type in pragma)
        //!cast(int32, float32): Wrong def type
        BadCast = {
            Alias = Correct
            DataSource = MyDS
            Type = float32 // Error because pragma mismatch
        }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}

	idx := index.NewProjectTree()
	idx.AddFile("signal_props.marte", config)
	idx.ResolveReferences()

	v := validator.NewValidator(idx, ".")
	v.ValidateProject()

	foundBadType := false
	foundBadElements := false
	foundBadCast := false

	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "property 'Type' mismatch") {
			if strings.Contains(d.Message, "'BadType'") {
				foundBadType = true
			}
			if strings.Contains(d.Message, "'BadCast'") {
				foundBadCast = true
			}
			if strings.Contains(d.Message, "'CastSig'") {
				t.Error("Unexpected error for CastSig (should be suppressed by pragma)")
			}
		}

		if strings.Contains(d.Message, "property 'NumberOfElements' mismatch") {
			foundBadElements = true
		}
	}

	if !foundBadType {
		t.Error("Expected error for BadType")
	}
	if !foundBadElements {
		t.Error("Expected error for BadElements")
	}
	if !foundBadCast {
		t.Error("Expected error for BadCast (pragma mismatch)")
	}
}
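The cast pragmas in the test above carry the expected type pair in parentheses: `//!cast(defType, useType): reason`. A minimal sketch of matching that pair against the aliased definition and the local usage — `castAllowed` is a hypothetical illustration of the rule, not the validator's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// castAllowed checks a "//!cast(defType, useType): reason" pragma against
// the aliased signal's definition type and the local usage type. Per the
// test: CastSig passes because its pragma names (uint32, float32); BadCast
// fails because its pragma names int32 while the definition is uint32.
func castAllowed(pragma, defType, useType string) bool {
	rest, ok := strings.CutPrefix(strings.TrimSpace(pragma), "//!cast(")
	if !ok {
		return false
	}
	end := strings.Index(rest, ")")
	if end < 0 {
		return false
	}
	parts := strings.SplitN(rest[:end], ",", 2)
	if len(parts) != 2 {
		return false
	}
	return strings.TrimSpace(parts[0]) == defType && strings.TrimSpace(parts[1]) == useType
}

func main() {
	fmt.Println(castAllowed("//!cast(uint32, float32): Cast reason", "uint32", "float32"))  // true
	fmt.Println(castAllowed("//!cast(int32, float32): Wrong def type", "uint32", "float32")) // false
}
```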
73	test/validator_signal_test.go	Normal file
@@ -0,0 +1,73 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSignalValidation(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +ValidDS = {
        Class = DataSource
        Signals = {
            ValidSig = {
                Type = uint32
            }
        }
    }
    +MissingTypeDS = {
        Class = DataSource
        Signals = {
            InvalidSig = {
                // Missing Type
                Dummy = 1
            }
        }
    }
    +InvalidTypeDS = {
        Class = DataSource
        Signals = {
            InvalidSig = {
                Type = invalid_type
            }
        }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}

	idx := index.NewProjectTree()
	idx.AddFile("signal_test.marte", config)

	v := validator.NewValidator(idx, ".")
	v.ValidateProject()

	foundMissing := false
	foundInvalid := false

	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "missing mandatory field 'Type'") {
			foundMissing = true
		}
		if strings.Contains(d.Message, "Invalid Type 'invalid_type'") {
			foundInvalid = true
		}
	}

	if !foundMissing {
		t.Error("Expected error for missing Type field in Signal")
	}
	if !foundInvalid {
		t.Error("Expected error for invalid Type value in Signal")
	}
}
59	test/validator_signals_content_test.go	Normal file
@@ -0,0 +1,59 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSignalsContentValidation(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +BadDS = {
        Class = DataSource
        Signals = {
            BadField = 1
            BadArray = { 1 2 }
            // Valid signal
            ValidSig = {
                Type = uint32
            }
        }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}

	idx := index.NewProjectTree()
	idx.AddFile("signals_content.marte", config)

	v := validator.NewValidator(idx, ".")
	v.ValidateProject()

	foundBadField := false
	foundBadArray := false

	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "Field 'BadField' is not allowed") {
			foundBadField = true
		}
		if strings.Contains(d.Message, "Field 'BadArray' is not allowed") {
			foundBadArray = true
		}
	}

	if !foundBadField {
		t.Error("Expected error for BadField in Signals")
	}
	if !foundBadArray {
		t.Error("Expected error for BadArray in Signals")
	}
}
@@ -3,9 +3,9 @@ package integration
 import (
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestUnusedGAM(t *testing.T) {
@@ -63,8 +63,10 @@ $App = {
 $Data = {
     +MyDS = {
         Class = DataSourceClass
-        Sig1 = { Type = uint32 }
-        Sig2 = { Type = uint32 }
+        +Signals = {
+            Sig1 = { Type = uint32 }
+            Sig2 = { Type = uint32 }
+        }
     }
 }
 }