Compare commits

**20 Commits** (`1ea518a58a` ... `0.1.0`)

| Author | SHA1 | Date |
|---|---|---|
| | `0ffcecf19e` | |
| | `761cf83b8e` | |
| | `7caf3a5da5` | |
| | `94ee7e4880` | |
| | `ce9b68200e` | |
| | `e3c84fcf60` | |
| | `4a515fd6c3` | |
| | `14cba1b530` | |
| | `462c832651` | |
| | `77fe3e9cac` | |
| | `0ee44c0a27` | |
| | `d450d358b4` | |
| | `2cdcfe2812` | |
| | `ef7729475a` | |
| | `99bd5bffdd` | |
| | `4379960835` | |
| | `2aeec1e5f6` | |
| | `5853365707` | |
| | `5c3f05a1a4` | |
| | `e2c87c90f3` | |
**.gitignore** (vendored, +2)

```diff
@@ -1,2 +1,4 @@
 build
 *.log
+mdt
+*.out
```
**LICENSE** (new file, +21)

```text
MIT License

Copyright (c) 2026 MARTe Community

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
**Makefile** (new file, +24)

```makefile
BINARY_NAME=mdt
BUILD_DIR=build

.PHONY: all build test coverage clean install

all: test build

build:
	mkdir -p $(BUILD_DIR)
	go build -o $(BUILD_DIR)/$(BINARY_NAME) ./cmd/mdt

test:
	go test -v ./...

coverage:
	go test -cover -coverprofile=coverage.out ./test/... -coverpkg=./internal/...
	go tool cover -func=coverage.out

clean:
	rm -rf $(BUILD_DIR)
	rm -f coverage.out

install:
	go install ./cmd/mdt
```
**README.md** (new file, +96)

````markdown
# MARTe Development Tools (mdt)

`mdt` is a comprehensive toolkit for developing, validating, and building configurations for the MARTe real-time framework. It provides a CLI and a Language Server Protocol (LSP) server to enhance the development experience.

## Features

- **LSP Server**: Real-time syntax checking, validation, autocomplete, hover documentation, and navigation (Go to Definition/References).
- **Builder**: Merges multiple configuration files into a single, ordered output file.
- **Formatter**: Standardizes configuration file formatting.
- **Validator**: Advanced semantic validation using [CUE](https://cuelang.org/) schemas, ensuring type safety and structural correctness.

## Installation

### From Source

Requirements: Go 1.21+

```bash
go install github.com/marte-community/marte-dev-tools/cmd/mdt@latest
```

## Usage

### CLI Commands

- **Check**: Run validation on a file or project.
  ```bash
  mdt check path/to/project
  ```
- **Build**: Merge project files into a single output.
  ```bash
  mdt build -o output.marte main.marte
  ```
- **Format**: Format configuration files.
  ```bash
  mdt fmt path/to/file.marte
  ```
- **LSP**: Start the language server (used by editor plugins).
  ```bash
  mdt lsp
  ```

### Editor Integration

`mdt lsp` implements the Language Server Protocol. You can use it with any LSP-compatible editor (VS Code, Neovim, Emacs, etc.).

## MARTe Configuration

The tools support the MARTe configuration format with extended features:

- **Objects**: `+Node = { Class = ... }`
- **Signals**: `Signal = { Type = ... }`
- **Namespaces**: `#package PROJECT.NODE` for organizing multi-file projects.

### Validation & Schema

Validation is fully schema-driven using CUE.

- **Built-in Schema**: Covers standard MARTe classes (`StateMachine`, `GAM`, `DataSource`, `RealTimeApplication`, etc.).
- **Custom Schema**: Add a `.marte_schema.cue` file to your project root to extend or override definitions.

**Example `.marte_schema.cue`:**

```cue
package schema

#Classes: {
	MyCustomGAM: {
		Param1: int
		Param2?: string
		...
	}
}
```

### Pragmas (Suppressing Warnings)

Use comments starting with `//!` to control validation behavior:

- `//!unused: Reason` - Suppress "Unused GAM" or "Unused Signal" warnings.
- `//!implicit: Reason` - Suppress "Implicitly Defined Signal" warnings.
- `//!cast(DefinedType, UsageType)` - Allow type mismatch between definition and usage (e.g. `//!cast(uint32, int32)`).
- `//!allow(unused)` - Global suppression for the file.

## Development

### Building

```bash
go build ./cmd/mdt
```

### Running Tests

```bash
go test ./...
```

## License

MIT
````
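The pragma rules listed in the README can be combined in a single file. The fragment below is an illustrative sketch, not a file from this repository; the signal name and reason strings are made up, and pragma placement follows the pattern visible in the project's test fixtures (a `//!implicit:` comment on the line before the signal definition):

```
//!allow(unused)

//!implicit: populated by the timing service
Time_DDB1 = {
    DataSource = DDB1
    Type = uint32
}
```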
```diff
@@ -4,13 +4,13 @@ import (
 	"bytes"
 	"os"
 
-	"github.com/marte-dev/marte-dev-tools/internal/builder"
-	"github.com/marte-dev/marte-dev-tools/internal/formatter"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/lsp"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/builder"
+	"github.com/marte-community/marte-dev-tools/internal/formatter"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/lsp"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func main() {
```
```diff
@@ -765,6 +765,7 @@ $TbTestApp = {
         DataSource = DDB1
         Type = uint32
     }
+    //!implicit: defined here as I am lazy
     Time_DDB1 = {
         DataSource = DDB1
         Type = uint32
```
**go.mod** (+17)

```diff
@@ -1,3 +1,18 @@
-module github.com/marte-dev/marte-dev-tools
+module github.com/marte-community/marte-dev-tools
 
 go 1.25.6
+
+require cuelang.org/go v0.15.3
+
+require (
+	github.com/cockroachdb/apd/v3 v3.2.1 // indirect
+	github.com/emicklei/proto v1.14.2 // indirect
+	github.com/google/uuid v1.6.0 // indirect
+	github.com/mitchellh/go-wordwrap v1.0.1 // indirect
+	github.com/pelletier/go-toml/v2 v2.2.4 // indirect
+	github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91 // indirect
+	go.yaml.in/yaml/v3 v3.0.4 // indirect
+	golang.org/x/net v0.46.0 // indirect
+	golang.org/x/text v0.30.0 // indirect
+	google.golang.org/protobuf v1.33.0 // indirect
+)
```
**go.sum** (new file, +53)

```text
cuelabs.dev/go/oci/ociregistry v0.0.0-20250722084951-074d06050084 h1:4k1yAtPvZJZQTu8DRY8muBo0LHv6TqtrE0AO5n6IPYs=
cuelabs.dev/go/oci/ociregistry v0.0.0-20250722084951-074d06050084/go.mod h1:4WWeZNxUO1vRoZWAHIG0KZOd6dA25ypyWuwD3ti0Tdc=
cuelang.org/go v0.15.3 h1:JKR/lZVwuIGlLTGIaJ0jONz9+CK3UDx06sQ6DDxNkaE=
cuelang.org/go v0.15.3/go.mod h1:NYw6n4akZcTjA7QQwJ1/gqWrrhsN4aZwhcAL0jv9rZE=
github.com/cockroachdb/apd/v3 v3.2.1 h1:U+8j7t0axsIgvQUqthuNm82HIrYXodOV2iWLWtEaIwg=
github.com/cockroachdb/apd/v3 v3.2.1/go.mod h1:klXJcjp+FffLTHlhIG69tezTDvdP065naDsHzKhYSqc=
github.com/emicklei/proto v1.14.2 h1:wJPxPy2Xifja9cEMrcA/g08art5+7CGJNFNk35iXC1I=
github.com/emicklei/proto v1.14.2/go.mod h1:rn1FgRS/FANiZdD2djyH7TMA9jdRDcYQ9IEN9yvjX0A=
github.com/go-quicktest/qt v1.101.0 h1:O1K29Txy5P2OK0dGo59b7b0LR6wKfIhttaAhHUyn7eI=
github.com/go-quicktest/qt v1.101.0/go.mod h1:14Bz/f7NwaXPtdYEgzsx46kqSxVwTbzVZsDC26tQJow=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=
github.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=
github.com/lib/pq v1.10.7 h1:p7ZhMD+KsSRozJr34udlUrhboJwWAgCg34+/ZZNvZZw=
github.com/lib/pq v1.10.7/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
github.com/mitchellh/go-wordwrap v1.0.1 h1:TLuKupo69TCn6TQSyGxwI1EblZZEsQ0vMlAFQflz0v0=
github.com/mitchellh/go-wordwrap v1.0.1/go.mod h1:R62XHJLzvMFRBbcrT7m7WgmE1eOyTSsCt+hzestvNj0=
github.com/opencontainers/go-digest v1.0.0 h1:apOUWs51W5PlhuyGyz9FCeeBIOUDA/6nW8Oi/yOhh5U=
github.com/opencontainers/go-digest v1.0.0/go.mod h1:0JzlMkj0TRzQZfJkVvzbP0HBR3IKzErnv2BNG4W4MAM=
github.com/opencontainers/image-spec v1.1.1 h1:y0fUlFfIZhPF1W537XOLg0/fcx6zcHCJwooC2xJA040=
github.com/opencontainers/image-spec v1.1.1/go.mod h1:qpqAh3Dmcf36wStyyWU+kCeDgrGnAve2nCC8+7h8Q0M=
github.com/pelletier/go-toml/v2 v2.2.4 h1:mye9XuhQ6gvn5h28+VilKrrPoQVanw5PMw/TB0t5Ec4=
github.com/pelletier/go-toml/v2 v2.2.4/go.mod h1:2gIqNv+qfxSVS7cM2xJQKtLSTLUE9V8t9Stt+h56mCY=
github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91 h1:s1LvMaU6mVwoFtbxv/rCZKE7/fwDmDY684FfUe4c1Io=
github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91/go.mod h1:JSbkp0BviKovYYt9XunS95M3mLPibE9bGg+Y95DsEEY=
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
go.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/mod v0.29.0 h1:HV8lRxZC4l2cr3Zq1LvtOsi/ThTgWnUk/y64QSs8GwA=
golang.org/x/mod v0.29.0/go.mod h1:NyhrlYXJ2H4eJiRy/WDBO6HMqZQ6q9nk4JzS3NuCK+w=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/tools v0.38.0 h1:Hx2Xv8hISq8Lm16jvBZ2VQf+RLmbd7wVUsALibYI/IQ=
golang.org/x/tools v0.38.0/go.mod h1:yEsQ/d/YK8cjh0L6rZlY8tgtlKiBNTL14pGDJPJpYQs=
google.golang.org/protobuf v1.33.0 h1:uNO2rsAINq/JlFpSdYEKIZ0uKD/R9cpdv0T+yoGwGmI=
google.golang.org/protobuf v1.33.0/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 h1:qIbj1fsPNlZgppZ+VLlY7N33q108Sa+fhmuc+sWQYwY=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
```
```diff
@@ -6,8 +6,8 @@ import (
 	"sort"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 type Builder struct {
```
```diff
@@ -71,86 +71,38 @@ func (b *Builder) writeNodeContent(f *os.File, node *index.ProjectNode, indent i
 	indentStr := strings.Repeat(" ", indent)
 
 	// If this node has a RealName (e.g. +App), we print it as an object definition
-	// UNLESS it is the top-level output file itself?
-	// If we are writing "App.marte", maybe we are writing the *body* of App?
-	// Spec: "unifying multi-file project into a single configuration output"
-
-	// Let's assume we print the Node itself.
 	if node.RealName != "" {
 		fmt.Fprintf(f, "%s%s = {\n", indentStr, node.RealName)
 		indent++
 		indentStr = strings.Repeat(" ", indent)
 	}
 
+	writtenChildren := make(map[string]bool)
+
 	// 2. Write definitions from fragments
 	for _, frag := range node.Fragments {
-		// Use formatter logic to print definitions
-		// We need a temporary Config to use Formatter?
-		// Or just reimplement basic printing? Formatter is better.
-		// But Formatter prints to io.Writer.
-
-		// We can reuse formatDefinition logic if we exposed it, or just copy basic logic.
-		// Since we need to respect indentation, using Formatter.Format might be tricky
-		// unless we wrap definitions in a dummy structure.
-
 		for _, def := range frag.Definitions {
-			// Basic formatting for now, referencing formatter style
-			b.writeDefinition(f, def, indent)
+			switch d := def.(type) {
+			case *parser.Field:
+				b.writeDefinition(f, d, indent)
+			case *parser.ObjectNode:
+				norm := index.NormalizeName(d.Name)
+				if child, ok := node.Children[norm]; ok {
+					if !writtenChildren[norm] {
+						b.writeNodeContent(f, child, indent)
+						writtenChildren[norm] = true
+					}
+				}
+			}
 		}
 	}
 
 	// 3. Write Children (recursively)
-	// Children are sub-nodes defined implicitly via #package A.B or explicitly +Sub
-	// Explicit +Sub are handled via Fragments logic (they are definitions in fragments).
-	// Implicit nodes (from #package A.B.C where B was never explicitly defined)
-	// show up in Children map but maybe not in Fragments?
-
-	// If a Child is NOT in fragments (implicit), we still need to write it.
-	// If it IS in fragments (explicit +Child), it was handled in loop above?
-	// Wait. My Indexer puts `+Sub` into `node.Children["Sub"]` AND adds a `Fragment` to `node` containing `+Sub` object?
-
-	// Let's check Indexer.
-	// Case ObjectNode:
-	// Adds Fragment to `child` (the Sub node).
-	// Does NOT add `ObjectNode` definition to `node`'s fragment list?
-	// "pt.addObjectFragment(child...)"
-	// It does NOT add to `fileFragment.Definitions`.
-
-	// So `node.Fragments` only contains Fields!
-	// Children are all in `node.Children`.
-
-	// So:
-	// 1. Write Fields (from Fragments).
-	// 2. Write Children (from Children map).
-
-	// But wait, Fragments might have order?
-	// "Relative ordering within a file is preserved."
-	// My Indexer splits Fields and Objects.
-	// Fields go to Fragments. Objects go to Children.
-	// This loses the relative order between Fields and Objects in the source file!
-
-	// Correct Indexer approach for preserving order:
-	// `Fragment` should contain a list of `Entry`.
-	// `Entry` can be `Field` OR `ChildNodeName`.
-
-	// But I just rewrote Indexer to split them.
-	// If strict order is required "within a file", my Indexer is slightly lossy regarding Field vs Object order.
-	// Spec: "Relative ordering within a file is preserved."
-
-	// To fix this without another full rewrite:
-	// Iterating `node.Children` alphabetically is arbitrary.
-	// We should ideally iterate them in the order they appear.
-
-	// For now, I will proceed with writing Children after Fields, which is a common convention,
-	// unless strict interleaving is required.
-	// Given "Class first" rule, reordering happens anyway.
-
-	// Sorting Children?
-	// Maybe keep a list of OrderedChildren in ProjectNode?
-
 	sortedChildren := make([]string, 0, len(node.Children))
 	for k := range node.Children {
-		sortedChildren = append(sortedChildren, k)
+		if !writtenChildren[k] {
+			sortedChildren = append(sortedChildren, k)
+		}
 	}
 	sort.Strings(sortedChildren) // Alphabetical for determinism
```
```diff
@@ -6,7 +6,7 @@ import (
 	"sort"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 type Insertable struct {
```
```diff
@@ -5,8 +5,8 @@ import (
 	"path/filepath"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 type ProjectTree struct {
```
```diff
@@ -222,6 +222,7 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
 		fileFragment.Definitions = append(fileFragment.Definitions, d)
 		pt.indexValue(file, d.Value)
 	case *parser.ObjectNode:
+		fileFragment.Definitions = append(fileFragment.Definitions, d)
 		norm := NormalizeName(d.Name)
 		if _, ok := node.Children[norm]; !ok {
 			node.Children[norm] = &ProjectNode{
```
```diff
@@ -276,6 +277,7 @@ func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *pa
 		pt.indexValue(file, d.Value)
 		pt.extractFieldMetadata(node, d)
 	case *parser.ObjectNode:
+		frag.Definitions = append(frag.Definitions, d)
 		norm := NormalizeName(d.Name)
 		if _, ok := node.Children[norm]; !ok {
 			node.Children[norm] = &ProjectNode{
```
```diff
@@ -390,25 +392,65 @@ func (pt *ProjectTree) ResolveReferences() {
 	for i := range pt.References {
 		ref := &pt.References[i]
 		if isoNode, ok := pt.IsolatedFiles[ref.File]; ok {
-			ref.Target = pt.findNode(isoNode, ref.Name)
+			ref.Target = pt.FindNode(isoNode, ref.Name, nil)
 		} else {
-			ref.Target = pt.findNode(pt.Root, ref.Name)
+			ref.Target = pt.FindNode(pt.Root, ref.Name, nil)
 		}
 	}
 }
 
-func (pt *ProjectTree) findNode(root *ProjectNode, name string) *ProjectNode {
+func (pt *ProjectTree) FindNode(root *ProjectNode, name string, predicate func(*ProjectNode) bool) *ProjectNode {
+	if strings.Contains(name, ".") {
+		parts := strings.Split(name, ".")
+		rootName := parts[0]
+
+		var candidates []*ProjectNode
+		pt.findAllNodes(root, rootName, &candidates)
+
+		for _, cand := range candidates {
+			curr := cand
+			valid := true
+			for i := 1; i < len(parts); i++ {
+				nextName := parts[i]
+				normNext := NormalizeName(nextName)
+				if child, ok := curr.Children[normNext]; ok {
+					curr = child
+				} else {
+					valid = false
+					break
+				}
+			}
+			if valid {
+				if predicate == nil || predicate(curr) {
+					return curr
+				}
+			}
+		}
+		return nil
+	}
+
 	if root.RealName == name || root.Name == name {
-		return root
+		if predicate == nil || predicate(root) {
+			return root
+		}
 	}
 	for _, child := range root.Children {
-		if res := pt.findNode(child, name); res != nil {
+		if res := pt.FindNode(child, name, predicate); res != nil {
 			return res
 		}
 	}
 	return nil
 }
 
+func (pt *ProjectTree) findAllNodes(root *ProjectNode, name string, results *[]*ProjectNode) {
+	if root.RealName == name || root.Name == name {
+		*results = append(*results, root)
+	}
+	for _, child := range root.Children {
+		pt.findAllNodes(child, name, results)
+	}
+}
+
 type QueryResult struct {
 	Node *ProjectNode
 	Field *parser.Field
```
```diff
@@ -7,15 +7,51 @@ import (
 	"fmt"
 	"io"
 	"os"
+	"regexp"
 	"strings"
 
-	"github.com/marte-dev/marte-dev-tools/internal/formatter"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/formatter"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/schema"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
+
+	"cuelang.org/go/cue"
 )
 
+type CompletionParams struct {
+	TextDocument TextDocumentIdentifier `json:"textDocument"`
+	Position     Position               `json:"position"`
+	Context      CompletionContext      `json:"context,omitempty"`
+}
+
+type CompletionContext struct {
+	TriggerKind int `json:"triggerKind"`
+}
+
+type CompletionItem struct {
+	Label            string `json:"label"`
+	Kind             int    `json:"kind"`
+	Detail           string `json:"detail,omitempty"`
+	Documentation    string `json:"documentation,omitempty"`
+	InsertText       string `json:"insertText,omitempty"`
+	InsertTextFormat int    `json:"insertTextFormat,omitempty"` // 1: PlainText, 2: Snippet
+	SortText         string `json:"sortText,omitempty"`
+}
+
+type CompletionList struct {
+	IsIncomplete bool             `json:"isIncomplete"`
+	Items        []CompletionItem `json:"items"`
+}
+
+var Tree = index.NewProjectTree()
+var Documents = make(map[string]string)
+var ProjectRoot string
+var GlobalSchema *schema.Schema
+
 type JsonRpcMessage struct {
 	Jsonrpc string `json:"jsonrpc"`
 	Method  string `json:"method,omitempty"`
```
```diff
@@ -135,9 +171,6 @@ type TextEdit struct {
 	NewText string `json:"newText"`
 }
 
-var tree = index.NewProjectTree()
-var documents = make(map[string]string)
-var projectRoot string
-
 func RunServer() {
 	reader := bufio.NewReader(os.Stdin)
```
```diff
@@ -151,7 +184,7 @@ func RunServer() {
 			continue
 		}
 
-		handleMessage(msg)
+		HandleMessage(msg)
 	}
 }
```
```diff
@@ -181,7 +214,7 @@ func readMessage(reader *bufio.Reader) (*JsonRpcMessage, error) {
 	return &msg, err
 }
 
-func handleMessage(msg *JsonRpcMessage) {
+func HandleMessage(msg *JsonRpcMessage) {
 	switch msg.Method {
 	case "initialize":
 		var params InitializeParams
```
```diff
@@ -194,12 +227,13 @@ func handleMessage(msg *JsonRpcMessage) {
 	}
 
 	if root != "" {
-		projectRoot = root
+		ProjectRoot = root
 		logger.Printf("Scanning workspace: %s\n", root)
-		if err := tree.ScanDirectory(root); err != nil {
+		if err := Tree.ScanDirectory(root); err != nil {
 			logger.Printf("ScanDirectory failed: %v\n", err)
 		}
-		tree.ResolveReferences()
+		Tree.ResolveReferences()
+		GlobalSchema = schema.LoadFullSchema(ProjectRoot)
 	}
 }
```
```diff
@@ -210,6 +244,9 @@ func handleMessage(msg *JsonRpcMessage) {
 	"definitionProvider":         true,
 	"referencesProvider":         true,
 	"documentFormattingProvider": true,
+	"completionProvider": map[string]any{
+		"triggerCharacters": []string{"=", " "},
+	},
 	},
 })
 case "initialized":
```
```diff
@@ -221,18 +258,18 @@ func handleMessage(msg *JsonRpcMessage) {
 case "textDocument/didOpen":
 	var params DidOpenTextDocumentParams
 	if err := json.Unmarshal(msg.Params, &params); err == nil {
-		handleDidOpen(params)
+		HandleDidOpen(params)
 	}
 case "textDocument/didChange":
 	var params DidChangeTextDocumentParams
 	if err := json.Unmarshal(msg.Params, &params); err == nil {
-		handleDidChange(params)
+		HandleDidChange(params)
 	}
 case "textDocument/hover":
 	var params HoverParams
 	if err := json.Unmarshal(msg.Params, &params); err == nil {
 		logger.Printf("Hover: %s:%d", params.TextDocument.URI, params.Position.Line)
-		res := handleHover(params)
+		res := HandleHover(params)
 		if res != nil {
 			logger.Printf("Res: %v", res.Contents)
 		} else {
```
```diff
@@ -246,17 +283,22 @@ func handleMessage(msg *JsonRpcMessage) {
 case "textDocument/definition":
 	var params DefinitionParams
 	if err := json.Unmarshal(msg.Params, &params); err == nil {
-		respond(msg.ID, handleDefinition(params))
+		respond(msg.ID, HandleDefinition(params))
 	}
 case "textDocument/references":
 	var params ReferenceParams
 	if err := json.Unmarshal(msg.Params, &params); err == nil {
-		respond(msg.ID, handleReferences(params))
+		respond(msg.ID, HandleReferences(params))
+	}
+case "textDocument/completion":
+	var params CompletionParams
+	if err := json.Unmarshal(msg.Params, &params); err == nil {
+		respond(msg.ID, HandleCompletion(params))
 	}
 case "textDocument/formatting":
 	var params DocumentFormattingParams
 	if err := json.Unmarshal(msg.Params, &params); err == nil {
-		respond(msg.ID, handleFormatting(params))
+		respond(msg.ID, HandleFormatting(params))
 	}
 }
 }
```
@@ -265,41 +307,51 @@ func uriToPath(uri string) string {
    return strings.TrimPrefix(uri, "file://")
}

-func handleDidOpen(params DidOpenTextDocumentParams) {
+func HandleDidOpen(params DidOpenTextDocumentParams) {
    path := uriToPath(params.TextDocument.URI)
-    documents[params.TextDocument.URI] = params.TextDocument.Text
+    Documents[params.TextDocument.URI] = params.TextDocument.Text
    p := parser.NewParser(params.TextDocument.Text)
    config, err := p.Parse()

    if err != nil {
        publishParserError(params.TextDocument.URI, err)
-        return
+    } else {
+        publishParserError(params.TextDocument.URI, nil)
+    }
+
+    if config != nil {
+        Tree.AddFile(path, config)
+        Tree.ResolveReferences()
+        runValidation(params.TextDocument.URI)
    }
-    tree.AddFile(path, config)
-    tree.ResolveReferences()
-    runValidation(params.TextDocument.URI)
}

-func handleDidChange(params DidChangeTextDocumentParams) {
+func HandleDidChange(params DidChangeTextDocumentParams) {
    if len(params.ContentChanges) == 0 {
        return
    }
    text := params.ContentChanges[0].Text
-    documents[params.TextDocument.URI] = text
+    Documents[params.TextDocument.URI] = text
    path := uriToPath(params.TextDocument.URI)
    p := parser.NewParser(text)
    config, err := p.Parse()

    if err != nil {
        publishParserError(params.TextDocument.URI, err)
-        return
+    } else {
+        publishParserError(params.TextDocument.URI, nil)
+    }
+
+    if config != nil {
+        Tree.AddFile(path, config)
+        Tree.ResolveReferences()
+        runValidation(params.TextDocument.URI)
    }
-    tree.AddFile(path, config)
-    tree.ResolveReferences()
-    runValidation(params.TextDocument.URI)
}

-func handleFormatting(params DocumentFormattingParams) []TextEdit {
+func HandleFormatting(params DocumentFormattingParams) []TextEdit {
    uri := params.TextDocument.URI
-    text, ok := documents[uri]
+    text, ok := Documents[uri]
    if !ok {
        return nil
    }
@@ -331,7 +383,7 @@ func handleFormatting(params DocumentFormattingParams) []TextEdit {
}

func runValidation(uri string) {
-    v := validator.NewValidator(tree, projectRoot)
+    v := validator.NewValidator(Tree, ProjectRoot)
    v.ValidateProject()
    v.CheckUnused()

@@ -340,7 +392,7 @@ func runValidation(uri string) {

    // Collect all known files to ensure we clear diagnostics for fixed files
    knownFiles := make(map[string]bool)
-    collectFiles(tree.Root, knownFiles)
+    collectFiles(Tree.Root, knownFiles)

    // Initialize all known files with empty diagnostics
    for f := range knownFiles {
@@ -385,6 +437,19 @@ func runValidation(uri string) {
}

func publishParserError(uri string, err error) {
+    if err == nil {
+        notification := JsonRpcMessage{
+            Jsonrpc: "2.0",
+            Method:  "textDocument/publishDiagnostics",
+            Params: mustMarshal(PublishDiagnosticsParams{
+                URI:         uri,
+                Diagnostics: []LSPDiagnostic{},
+            }),
+        }
+        send(notification)
+        return
+    }
+
    var line, col int
    var msg string
    // Try parsing "line:col: message"
@@ -436,12 +501,12 @@ func mustMarshal(v any) json.RawMessage {
    return b
}

-func handleHover(params HoverParams) *Hover {
+func HandleHover(params HoverParams) *Hover {
    path := uriToPath(params.TextDocument.URI)
    line := params.Position.Line + 1
    col := params.Position.Character + 1

-    res := tree.Query(path, line, col)
+    res := Tree.Query(path, line, col)
    if res == nil {
        logger.Printf("No object/node/reference found")
        return nil
@@ -488,12 +553,314 @@ func handleHover(params HoverParams) *Hover {
    }
}

-func handleDefinition(params DefinitionParams) any {
+func HandleCompletion(params CompletionParams) *CompletionList {
+    uri := params.TextDocument.URI
+    path := uriToPath(uri)
+    text, ok := Documents[uri]
+    if !ok {
+        return nil
+    }
+
+    lines := strings.Split(text, "\n")
+    if params.Position.Line >= len(lines) {
+        return nil
+    }
+    lineStr := lines[params.Position.Line]
+
+    col := params.Position.Character
+    if col > len(lineStr) {
+        col = len(lineStr)
+    }
+
+    prefix := lineStr[:col]
+
+    // Case 1: Assigning a value (ends with "=" or "= ")
+    if strings.Contains(prefix, "=") {
+        lastIdx := strings.LastIndex(prefix, "=")
+        beforeEqual := prefix[:lastIdx]
+
+        // Find the last identifier before '='
+        key := ""
+        re := regexp.MustCompile(`[a-zA-Z][a-zA-Z0-9_\-]*`)
+        matches := re.FindAllString(beforeEqual, -1)
+        if len(matches) > 0 {
+            key = matches[len(matches)-1]
+        }
+
+        if key == "Class" {
+            return suggestClasses()
+        }
+
+        container := Tree.GetNodeContaining(path, parser.Position{Line: params.Position.Line + 1, Column: col + 1})
+        if container != nil {
+            return suggestFieldValues(container, key, path)
+        }
+        return nil
+    }
+
+    // Case 2: Typing a key inside an object
+    container := Tree.GetNodeContaining(path, parser.Position{Line: params.Position.Line + 1, Column: col + 1})
+    if container != nil {
+        return suggestFields(container)
+    }
+
+    return nil
+}
+
+func suggestClasses() *CompletionList {
+    if GlobalSchema == nil {
+        return nil
+    }
+
+    classesVal := GlobalSchema.Value.LookupPath(cue.ParsePath("#Classes"))
+    if classesVal.Err() != nil {
+        return nil
+    }
+
+    iter, err := classesVal.Fields()
+    if err != nil {
+        return nil
+    }
+
+    var items []CompletionItem
+    for iter.Next() {
+        label := iter.Selector().String()
+        label = strings.Trim(label, "?!#")
+
+        items = append(items, CompletionItem{
+            Label:  label,
+            Kind:   7, // Class
+            Detail: "MARTe Class",
+        })
+    }
+    return &CompletionList{Items: items}
+}
+
+func suggestFields(container *index.ProjectNode) *CompletionList {
+    cls := container.Metadata["Class"]
+    if cls == "" {
+        return &CompletionList{Items: []CompletionItem{{
+            Label:      "Class",
+            Kind:       10, // Property
+            InsertText: "Class = ",
+            Detail:     "Define object class",
+        }}}
+    }
+
+    if GlobalSchema == nil {
+        return nil
+    }
+    classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s", cls))
+    classVal := GlobalSchema.Value.LookupPath(classPath)
+    if classVal.Err() != nil {
+        return nil
+    }
+
+    iter, err := classVal.Fields()
+    if err != nil {
+        return nil
+    }
+
+    existing := make(map[string]bool)
+    for _, frag := range container.Fragments {
+        for _, def := range frag.Definitions {
+            if f, ok := def.(*parser.Field); ok {
+                existing[f.Name] = true
+            }
+        }
+    }
+    for name := range container.Children {
+        existing[name] = true
+    }
+
+    var items []CompletionItem
+    for iter.Next() {
+        label := iter.Selector().String()
+        label = strings.Trim(label, "?!#")
+
+        // Skip if already present
+        if existing[label] {
+            continue
+        }
+
+        isOptional := iter.IsOptional()
+        kind := 10 // Property
+        detail := "Mandatory"
+        if isOptional {
+            detail = "Optional"
+        }
+
+        insertText := label + " = "
+        val := iter.Value()
+        if val.Kind() == cue.StructKind {
+            // Suggest as node
+            insertText = "+" + label + " = {\n\t$0\n}"
+            kind = 9 // Module
+        }
+
+        items = append(items, CompletionItem{
+            Label:            label,
+            Kind:             kind,
+            Detail:           detail,
+            InsertText:       insertText,
+            InsertTextFormat: 2, // Snippet
+        })
+    }
+    return &CompletionList{Items: items}
+}
+
+func suggestFieldValues(container *index.ProjectNode, field string, path string) *CompletionList {
+    var root *index.ProjectNode
+    if iso, ok := Tree.IsolatedFiles[path]; ok {
+        root = iso
+    } else {
+        root = Tree.Root
+    }
+
+    if field == "DataSource" {
+        return suggestObjects(root, "DataSource")
+    }
+    if field == "Functions" {
+        return suggestObjects(root, "GAM")
+    }
+    if field == "Type" {
+        return suggestSignalTypes()
+    }
+
+    if list := suggestCUEEnums(container, field); list != nil {
+        return list
+    }
+
+    return nil
+}
+
+func suggestSignalTypes() *CompletionList {
+    types := []string{
+        "uint8", "int8", "uint16", "int16", "uint32", "int32", "uint64", "int64",
+        "float32", "float64", "string", "bool", "char8",
+    }
+    var items []CompletionItem
+    for _, t := range types {
+        items = append(items, CompletionItem{
+            Label:  t,
+            Kind:   13, // EnumMember
+            Detail: "Signal Type",
+        })
+    }
+    return &CompletionList{Items: items}
+}
+
+func suggestCUEEnums(container *index.ProjectNode, field string) *CompletionList {
+    if GlobalSchema == nil {
+        return nil
+    }
+    cls := container.Metadata["Class"]
+    if cls == "" {
+        return nil
+    }
+
+    classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s.%s", cls, field))
+    val := GlobalSchema.Value.LookupPath(classPath)
+    if val.Err() != nil {
+        return nil
+    }
+
+    op, args := val.Expr()
+    var values []cue.Value
+    if op == cue.OrOp {
+        values = args
+    } else {
+        values = []cue.Value{val}
+    }
+
+    var items []CompletionItem
+    for _, v := range values {
+        if !v.IsConcrete() {
+            continue
+        }
+
+        str, err := v.String() // Returns quoted string for string values?
+        if err != nil {
+            continue
+        }
+
+        // Ensure strings are quoted
+        if v.Kind() == cue.StringKind && !strings.HasPrefix(str, "\"") {
+            str = fmt.Sprintf("\"%s\"", str)
+        }
+
+        items = append(items, CompletionItem{
+            Label:  str,
+            Kind:   13, // EnumMember
+            Detail: "Enum Value",
+        })
+    }
+
+    if len(items) > 0 {
+        return &CompletionList{Items: items}
+    }
+    return nil
+}
+
+func suggestObjects(root *index.ProjectNode, filter string) *CompletionList {
+    if root == nil {
+        return nil
+    }
+    var items []CompletionItem
+
+    var walk func(*index.ProjectNode)
+    walk = func(node *index.ProjectNode) {
+        match := false
+        if filter == "GAM" {
+            if isGAM(node) {
+                match = true
+            }
+        } else if filter == "DataSource" {
+            if isDataSource(node) {
+                match = true
+            }
+        }
+
+        if match {
+            items = append(items, CompletionItem{
+                Label:  node.Name,
+                Kind:   6, // Variable
+                Detail: node.Metadata["Class"],
+            })
+        }
+
+        for _, child := range node.Children {
+            walk(child)
+        }
+    }
+
+    walk(root)
+    return &CompletionList{Items: items}
+}
+
+func isGAM(node *index.ProjectNode) bool {
+    if node.RealName == "" || (node.RealName[0] != '+' && node.RealName[0] != '$') {
+        return false
+    }
+    _, hasInput := node.Children["InputSignals"]
+    _, hasOutput := node.Children["OutputSignals"]
+    return hasInput || hasOutput
+}
+
+func isDataSource(node *index.ProjectNode) bool {
+    if node.Parent != nil && node.Parent.Name == "Data" {
+        return true
+    }
+    _, hasSignals := node.Children["Signals"]
+    return hasSignals
+}
+
+func HandleDefinition(params DefinitionParams) any {
    path := uriToPath(params.TextDocument.URI)
    line := params.Position.Line + 1
    col := params.Position.Character + 1

-    res := tree.Query(path, line, col)
+    res := Tree.Query(path, line, col)
    if res == nil {
        return nil
    }
@@ -528,12 +895,12 @@ func handleDefinition(params DefinitionParams) any {
    return nil
}

-func handleReferences(params ReferenceParams) []Location {
+func HandleReferences(params ReferenceParams) []Location {
    path := uriToPath(params.TextDocument.URI)
    line := params.Position.Line + 1
    col := params.Position.Character + 1

-    res := tree.Query(path, line, col)
+    res := Tree.Query(path, line, col)
    if res == nil {
        return nil
    }
@@ -571,7 +938,7 @@ func handleReferences(params ReferenceParams) []Location {
    }

    // 1. References from index (Aliases)
-    for _, ref := range tree.References {
+    for _, ref := range Tree.References {
        if ref.Target == canonical {
            locations = append(locations, Location{
                URI: "file://" + ref.File,
@@ -584,7 +951,7 @@ func handleReferences(params ReferenceParams) []Location {
    }

    // 2. References from Node Targets (Direct References)
-    tree.Walk(func(node *index.ProjectNode) {
+    Tree.Walk(func(node *index.ProjectNode) {
        if node.Target == canonical {
            for _, frag := range node.Fragments {
                if frag.IsObject {
@@ -638,9 +1005,9 @@ func formatNodeInfo(node *index.ProjectNode) string {

    // Find references
    var refs []string
-    for _, ref := range tree.References {
+    for _, ref := range Tree.References {
        if ref.Target == node {
-            container := tree.GetNodeContaining(ref.File, ref.Position)
+            container := Tree.GetNodeContaining(ref.File, ref.Position)
            if container != nil {
                threadName := ""
                stateName := ""
@@ -1,210 +0,0 @@
-package lsp
-
-import (
-    "encoding/json"
-    "os"
-    "path/filepath"
-    "strings"
-    "testing"
-
-    "github.com/marte-dev/marte-dev-tools/internal/index"
-    "github.com/marte-dev/marte-dev-tools/internal/parser"
-)
-
-func TestInitProjectScan(t *testing.T) {
-    // 1. Setup temp dir with files
-    tmpDir, err := os.MkdirTemp("", "lsp_test")
-    if err != nil {
-        t.Fatal(err)
-    }
-    defer os.RemoveAll(tmpDir)
-
-    // File 1: Definition
-    if err := os.WriteFile(filepath.Join(tmpDir, "def.marte"), []byte("#package Test.Common\n+Target = { Class = C }"), 0644); err != nil {
-        t.Fatal(err)
-    }
-    // File 2: Reference
-    // +Source = { Class = C Link = Target }
-    // Link = Target starts at index ...
-    // #package Test.Common (21 chars including newline)
-    // +Source = { Class = C Link = Target }
-    // 012345678901234567890123456789012345
-    // Previous offset was 29.
-    // Now add 21?
-    // #package Test.Common\n
-    // +Source = ...
-    // So add 21 to Character? Or Line 1?
-    // It's on Line 1 (0-based 1).
-    if err := os.WriteFile(filepath.Join(tmpDir, "ref.marte"), []byte("#package Test.Common\n+Source = { Class = C Link = Target }"), 0644); err != nil {
-        t.Fatal(err)
-    }
-
-    // 2. Initialize
-    tree = index.NewProjectTree() // Reset global tree
-
-    initParams := InitializeParams{RootPath: tmpDir}
-    paramsBytes, _ := json.Marshal(initParams)
-
-    msg := &JsonRpcMessage{
-        Method: "initialize",
-        Params: paramsBytes,
-        ID:     1,
-    }
-
-    handleMessage(msg)
-
-    // Query the reference in ref.marte at "Target"
-    // Target starts at index 29 (0-based) on Line 1
-    defParams := DefinitionParams{
-        TextDocument: TextDocumentIdentifier{URI: "file://" + filepath.Join(tmpDir, "ref.marte")},
-        Position:     Position{Line: 1, Character: 29},
-    }
-
-    res := handleDefinition(defParams)
-    if res == nil {
-        t.Fatal("Definition not found via LSP after initialization")
-    }
-
-    locs, ok := res.([]Location)
-    if !ok {
-        t.Fatalf("Expected []Location, got %T", res)
-    }
-
-    if len(locs) == 0 {
-        t.Fatal("No locations found")
-    }
-
-    // Verify uri points to def.marte
-    expectedURI := "file://" + filepath.Join(tmpDir, "def.marte")
-    if locs[0].URI != expectedURI {
-        t.Errorf("Expected URI %s, got %s", expectedURI, locs[0].URI)
-    }
-}
-
-func TestHandleDefinition(t *testing.T) {
-    // Reset tree for test
-    tree = index.NewProjectTree()
-
-    content := `
-+MyObject = {
-    Class = Type
-}
-+RefObject = {
-    Class = Type
-    RefField = MyObject
-}
-`
-    path := "/test.marte"
-    p := parser.NewParser(content)
-    config, err := p.Parse()
-    if err != nil {
-        t.Fatalf("Parse failed: %v", err)
-    }
-    tree.AddFile(path, config)
-    tree.ResolveReferences()
-
-    t.Logf("Refs: %d", len(tree.References))
-    for _, r := range tree.References {
-        t.Logf("  %s at %d:%d", r.Name, r.Position.Line, r.Position.Column)
-    }
-
-    // Test Go to Definition on MyObject reference
-    params := DefinitionParams{
-        TextDocument: TextDocumentIdentifier{URI: "file://" + path},
-        Position:     Position{Line: 6, Character: 15}, // "MyObject" in RefField = MyObject
-    }
-
-    result := handleDefinition(params)
-    if result == nil {
-        t.Fatal("handleDefinition returned nil")
-    }
-
-    locations, ok := result.([]Location)
-    if !ok {
-        t.Fatalf("Expected []Location, got %T", result)
-    }
-
-    if len(locations) != 1 {
-        t.Fatalf("Expected 1 location, got %d", len(locations))
-    }
-
-    if locations[0].Range.Start.Line != 1 { // +MyObject is on line 2 (0-indexed 1)
-        t.Errorf("Expected definition on line 1, got %d", locations[0].Range.Start.Line)
-    }
-}
-
-func TestHandleReferences(t *testing.T) {
-    // Reset tree for test
-    tree = index.NewProjectTree()
-
-    content := `
-+MyObject = {
-    Class = Type
-}
-+RefObject = {
-    Class = Type
-    RefField = MyObject
-}
-+AnotherRef = {
-    Ref = MyObject
-}
-`
-    path := "/test.marte"
-    p := parser.NewParser(content)
-    config, err := p.Parse()
-    if err != nil {
-        t.Fatalf("Parse failed: %v", err)
-    }
-    tree.AddFile(path, config)
-    tree.ResolveReferences()
-
-    // Test Find References for MyObject (triggered from its definition)
-    params := ReferenceParams{
-        TextDocument: TextDocumentIdentifier{URI: "file://" + path},
-        Position:     Position{Line: 1, Character: 1}, // "+MyObject"
-        Context:      ReferenceContext{IncludeDeclaration: true},
-    }
-
-    locations := handleReferences(params)
-    if len(locations) != 3 { // 1 declaration + 2 references
-        t.Fatalf("Expected 3 locations, got %d", len(locations))
-    }
-}
-
-func TestLSPFormatting(t *testing.T) {
-    // Setup
-    content := `
-#package Proj.Main
-+Object={
-Field=1
-}
-`
-    uri := "file:///test.marte"
-
-    // Open (populate documents map)
-    documents[uri] = content
-
-    // Format
-    params := DocumentFormattingParams{
-        TextDocument: TextDocumentIdentifier{URI: uri},
-    }
-
-    edits := handleFormatting(params)
-
-    if len(edits) != 1 {
-        t.Fatalf("Expected 1 edit, got %d", len(edits))
-    }
-
-    newText := edits[0].NewText
-
-    expected := `#package Proj.Main
-
-+Object = {
-    Field = 1
-}
-`
-    // Normalize newlines for comparison just in case
-    if strings.TrimSpace(strings.ReplaceAll(newText, "\r\n", "\n")) != strings.TrimSpace(strings.ReplaceAll(expected, "\r\n", "\n")) {
-        t.Errorf("Formatting mismatch.\nExpected:\n%s\nGot:\n%s", expected, newText)
-    }
-}
@@ -11,6 +11,7 @@ type Parser struct {
    buf      []Token
    comments []Comment
    pragmas  []Pragma
+    errors   []error
}

func NewParser(input string) *Parser {
@@ -19,6 +20,10 @@ func NewParser(input string) *Parser {
    }
}

+func (p *Parser) addError(pos Position, msg string) {
+    p.errors = append(p.errors, fmt.Errorf("%d:%d: %s", pos.Line, pos.Column, msg))
+}
+
func (p *Parser) next() Token {
    if len(p.buf) > 0 {
        t := p.buf[0]
@@ -71,72 +76,82 @@ func (p *Parser) Parse() (*Configuration, error) {
            continue
        }

-        def, err := p.parseDefinition()
-        if err != nil {
-            return nil, err
+        def, ok := p.parseDefinition()
+        if ok {
+            config.Definitions = append(config.Definitions, def)
+        } else {
+            // Synchronization: skip token if not consumed to make progress
+            if p.peek() == tok {
+                p.next()
+            }
        }
-        config.Definitions = append(config.Definitions, def)
    }
    config.Comments = p.comments
    config.Pragmas = p.pragmas
-    return config, nil
+
+    var err error
+    if len(p.errors) > 0 {
+        err = p.errors[0]
+    }
+    return config, err
}

-func (p *Parser) parseDefinition() (Definition, error) {
+func (p *Parser) parseDefinition() (Definition, bool) {
    tok := p.next()
    switch tok.Type {
    case TokenIdentifier:
-        // Could be Field = Value OR Node = { ... }
        name := tok.Value
-        if p.next().Type != TokenEqual {
-            return nil, fmt.Errorf("%d:%d: expected =", tok.Position.Line, tok.Position.Column)
+        if p.peek().Type != TokenEqual {
+            p.addError(tok.Position, "expected =")
+            return nil, false
        }
+        p.next() // Consume =
+
-        // Disambiguate based on RHS
        nextTok := p.peek()
        if nextTok.Type == TokenLBrace {
-            // Check if it looks like a Subnode (contains definitions) or Array (contains values)
            if p.isSubnodeLookahead() {
-                sub, err := p.parseSubnode()
-                if err != nil {
-                    return nil, err
+                sub, ok := p.parseSubnode()
+                if !ok {
+                    return nil, false
                }
                return &ObjectNode{
                    Position: tok.Position,
                    Name:     name,
                    Subnode:  sub,
-                }, nil
+                }, true
            }
        }

-        // Default to Field
-        val, err := p.parseValue()
-        if err != nil {
-            return nil, err
+        val, ok := p.parseValue()
+        if !ok {
+            return nil, false
        }
        return &Field{
            Position: tok.Position,
            Name:     name,
            Value:    val,
-        }, nil
+        }, true

    case TokenObjectIdentifier:
-        // node = subnode
        name := tok.Value
-        if p.next().Type != TokenEqual {
-            return nil, fmt.Errorf("%d:%d: expected =", tok.Position.Line, tok.Position.Column)
+        if p.peek().Type != TokenEqual {
+            p.addError(tok.Position, "expected =")
+            return nil, false
        }
-        sub, err := p.parseSubnode()
-        if err != nil {
-            return nil, err
+        p.next() // Consume =
+
+        sub, ok := p.parseSubnode()
+        if !ok {
+            return nil, false
        }
        return &ObjectNode{
            Position: tok.Position,
            Name:     name,
            Subnode:  sub,
-        }, nil
+        }, true
    default:
-        return nil, fmt.Errorf("%d:%d: unexpected token %v", tok.Position.Line, tok.Position.Column, tok.Value)
+        p.addError(tok.Position, fmt.Sprintf("unexpected token %v", tok.Value))
+        return nil, false
    }
}

@@ -176,10 +191,11 @@ func (p *Parser) isSubnodeLookahead() bool {
    return false
}

-func (p *Parser) parseSubnode() (Subnode, error) {
+func (p *Parser) parseSubnode() (Subnode, bool) {
    tok := p.next()
    if tok.Type != TokenLBrace {
-        return Subnode{}, fmt.Errorf("%d:%d: expected {", tok.Position.Line, tok.Position.Column)
+        p.addError(tok.Position, "expected {")
+        return Subnode{}, false
    }
    sub := Subnode{Position: tok.Position}
    for {
@@ -190,43 +206,45 @@ func (p *Parser) parseSubnode() (Subnode, error) {
            break
        }
        if t.Type == TokenEOF {
-            return sub, fmt.Errorf("%d:%d: unexpected EOF, expected }", t.Position.Line, t.Position.Column)
+            p.addError(t.Position, "unexpected EOF, expected }")
+            sub.EndPosition = t.Position
+            return sub, true
        }
-        def, err := p.parseDefinition()
-        if err != nil {
-            return sub, err
+        def, ok := p.parseDefinition()
+        if ok {
+            sub.Definitions = append(sub.Definitions, def)
+        } else {
+            if p.peek() == t {
+                p.next()
+            }
        }
-        sub.Definitions = append(sub.Definitions, def)
    }
-    return sub, nil
+    return sub, true
}

-func (p *Parser) parseValue() (Value, error) {
+func (p *Parser) parseValue() (Value, bool) {
    tok := p.next()
    switch tok.Type {
    case TokenString:
        return &StringValue{
            Position: tok.Position,
            Value:    strings.Trim(tok.Value, "\""),
            Quoted:   true,
-        }, nil
+        }, true

    case TokenNumber:
-        // Simplistic handling
        if strings.Contains(tok.Value, ".") || strings.Contains(tok.Value, "e") {
            f, _ := strconv.ParseFloat(tok.Value, 64)
-            return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, nil
+            return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, true
        }
        i, _ := strconv.ParseInt(tok.Value, 0, 64)
-        return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, nil
+        return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, true
    case TokenBool:
        return &BoolValue{Position: tok.Position, Value: tok.Value == "true"},
-            nil
+            true
    case TokenIdentifier:
-        // reference?
-        return &ReferenceValue{Position: tok.Position, Value: tok.Value}, nil
+        return &ReferenceValue{Position: tok.Position, Value: tok.Value}, true
    case TokenLBrace:
-        // array
        arr := &ArrayValue{Position: tok.Position}
        for {
            t := p.peek()
@@ -239,14 +257,15 @@ func (p *Parser) parseValue() (Value, error) {
|
|||||||
p.next()
|
p.next()
|
||||||
continue
|
continue
|
||||||
}
|
}
|
||||||
val, err := p.parseValue()
|
val, ok := p.parseValue()
|
||||||
if err != nil {
|
if !ok {
|
||||||
return nil, err
|
return nil, false
|
||||||
}
|
}
|
||||||
arr.Elements = append(arr.Elements, val)
|
arr.Elements = append(arr.Elements, val)
|
||||||
}
|
}
|
||||||
return arr, nil
|
return arr, true
|
||||||
default:
|
default:
|
||||||
return nil, fmt.Errorf("%d:%d: unexpected value token %v", tok.Position.Line, tok.Position.Column, tok.Value)
|
p.addError(tok.Position, fmt.Sprintf("unexpected value token %v", tok.Value))
|
||||||
|
return nil, false
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
internal/schema/marte.cue (new file, +297)
@@ -0,0 +1,297 @@
+package schema
+
+#Classes: {
+	RealTimeApplication: {
+		Functions: {...} // type: node
+		Data!: {...} // type: node
+		States!: {...} // type: node
+		...
+	}
+	Message: {
+		...
+	}
+	StateMachineEvent: {
+		NextState!: string
+		NextStateError!: string
+		Timeout: uint32
+		[_= !~"^(Class|NextState|Timeout|NextStateError|[#_$].+)$"]: Message
+		...
+	}
+	_State: {
+		Class: "ReferenceContainer"
+		ENTER?: {
+			Class: "ReferenceContainer"
+			...
+		}
+		[_ = !~"^(Class|ENTER)$"]: StateMachineEvent
+		...
+	}
+	StateMachine: {
+		[_ = !~"^(Class|[$].*)$"]: _State
+		...
+	}
+	RealTimeState: {
+		Threads: {...} // type: node
+		...
+	}
+	RealTimeThread: {
+		Functions: [...] // type: array
+		...
+	}
+	GAMScheduler: {
+		TimingDataSource: string // type: reference
+		...
+	}
+	TimingDataSource: {
+		direction: "IN"
+		...
+	}
+	IOGAM: {
+		InputSignals?: {...} // type: node
+		OutputSignals?: {...} // type: node
+		...
+	}
+	ReferenceContainer: {
+		...
+	}
+	ConstantGAM: {
+		...
+	}
+	PIDGAM: {
+		Kp: float | int // type: float (allow int as it promotes)
+		Ki: float | int
+		Kd: float | int
+		...
+	}
+	FileDataSource: {
+		Filename: string
+		Format?: string
+		direction: "INOUT"
+		...
+	}
+	LoggerDataSource: {
+		direction: "OUT"
+		...
+	}
+	DANStream: {
+		Timeout?: int
+		direction: "OUT"
+		...
+	}
+	EPICSCAInput: {
+		direction: "IN"
+		...
+	}
+	EPICSCAOutput: {
+		direction: "OUT"
+		...
+	}
+	EPICSPVAInput: {
+		direction: "IN"
+		...
+	}
+	EPICSPVAOutput: {
+		direction: "OUT"
+		...
+	}
+	SDNSubscriber: {
+		Address: string
+		Port: int
+		Interface?: string
+		direction: "IN"
+		...
+	}
+	SDNPublisher: {
+		Address: string
+		Port: int
+		Interface?: string
+		direction: "OUT"
+		...
+	}
+	UDPReceiver: {
+		Port: int
+		Address?: string
+		direction: "IN"
+		...
+	}
+	UDPSender: {
+		Destination: string
+		direction: "OUT"
+		...
+	}
+	FileReader: {
+		Filename: string
+		Format?: string
+		Interpolate?: string
+		direction: "IN"
+		...
+	}
+	FileWriter: {
+		Filename: string
+		Format?: string
+		StoreOnTrigger?: int
+		direction: "OUT"
+		...
+	}
+	OrderedClass: {
+		First: int
+		Second: string
+		...
+	}
+	BaseLib2GAM: {...}
+	ConversionGAM: {...}
+	DoubleHandshakeGAM: {...}
+	FilterGAM: {
+		Num: [...]
+		Den: [...]
+		ResetInEachState?: _
+		InputSignals?: {...}
+		OutputSignals?: {...}
+		...
+	}
+	HistogramGAM: {
+		BeginCycleNumber?: int
+		StateChangeResetName?: string
+		InputSignals?: {...}
+		OutputSignals?: {...}
+		...
+	}
+	Interleaved2FlatGAM: {...}
+	FlattenedStructIOGAM: {...}
+	MathExpressionGAM: {
+		Expression: string
+		InputSignals?: {...}
+		OutputSignals?: {...}
+		...
+	}
+	MessageGAM: {...}
+	MuxGAM: {...}
+	SimulinkWrapperGAM: {...}
+	SSMGAM: {...}
+	StatisticsGAM: {...}
+	TimeCorrectionGAM: {...}
+	TriggeredIOGAM: {...}
+	WaveformGAM: {...}
+	DAN: {
+		direction: "OUT"
+		...
+	}
+	LinuxTimer: {
+		ExecutionMode?: string
+		SleepNature?: string
+		SleepPercentage?: _
+		Phase?: int
+		CPUMask?: int
+		TimeProvider?: {...}
+		Signals: {...}
+		direction: "IN"
+		...
+	}
+	LinkDataSource: {
+		direction: "INOUT"
+		...
+	}
+	MDSReader: {
+		TreeName: string
+		ShotNumber: int
+		Frequency: float | int
+		Signals: {...}
+		direction: "IN"
+		...
+	}
+	MDSWriter: {
+		NumberOfBuffers: int
+		CPUMask: int
+		StackSize: int
+		TreeName: string
+		PulseNumber?: int
+		StoreOnTrigger: int
+		EventName: string
+		TimeRefresh: float | int
+		NumberOfPreTriggers?: int
+		NumberOfPostTriggers?: int
+		Signals: {...}
+		Messages?: {...}
+		direction: "OUT"
+		...
+	}
+	NI1588TimeStamp: {
+		direction: "IN"
+		...
+	}
+	NI6259ADC: {
+		direction: "IN"
+		...
+	}
+	NI6259DAC: {
+		direction: "OUT"
+		...
+	}
+	NI6259DIO: {
+		direction: "INOUT"
+		...
+	}
+	NI6368ADC: {
+		direction: "IN"
+		...
+	}
+	NI6368DAC: {
+		direction: "OUT"
+		...
+	}
+	NI6368DIO: {
+		direction: "INOUT"
+		...
+	}
+	NI9157CircularFifoReader: {
+		direction: "IN"
+		...
+	}
+	NI9157MxiDataSource: {
+		direction: "INOUT"
+		...
+	}
+	OPCUADSInput: {
+		direction: "IN"
+		...
+	}
+	OPCUADSOutput: {
+		direction: "OUT"
+		...
+	}
+	RealTimeThreadAsyncBridge: {...}
+	RealTimeThreadSynchronisation: {...}
+	UARTDataSource: {
+		direction: "INOUT"
+		...
+	}
+	BaseLib2Wrapper: {...}
+	EPICSCAClient: {...}
+	EPICSPVA: {...}
+	MemoryGate: {...}
+	OPCUA: {...}
+	SysLogger: {...}
+	GAMDataSource: {
+		direction: "INOUT"
+		...
+	}
+}
+
+// Definition for any Object.
+// It must have a Class field.
+// Based on Class, it validates against #Classes.
+#Object: {
+	Class: string
+	// Allow any other field by default (extensibility),
+	// unless #Classes definition is closed.
+	// We allow open structs now.
+	...
+
+	// Unify if Class is known.
+	// If Class is NOT in #Classes, this might fail or do nothing depending on CUE logic.
+	// Actually, `#Classes[Class]` fails if key is missing.
+	// This ensures we validate against known classes.
+	// If we want to allow unknown classes, we need a check.
+	// But spec implies validation should check known classes.
+	#Classes[Class]
+}
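Under this schema, a configuration object is checked by unifying it with `#Object`, which in turn pulls in the class-specific constraints via `#Classes[Class]`. A hypothetical instance that would unify cleanly (the `controller` name and values are illustrative, not taken from the repository):

```cue
// Unifying this with #Object selects #Classes["PIDGAM"], so Kp/Ki/Kd
// must be numeric; unknown extra fields remain allowed (open struct).
controller: #Object & {
	Class: "PIDGAM"
	Kp:    1.5
	Ki:    0.1
	Kd:    0 // int is accepted: the schema declares float | int
}
```

If `Class` named a key absent from `#Classes`, the `#Classes[Class]` reference would fail to resolve, which is why the validator below skips unknown classes before unifying.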
@@ -1,237 +0,0 @@
-{
-  "classes": {
-    "RealTimeApplication": {
-      "fields": [
-        {"name": "Functions", "type": "node", "mandatory": true},
-        {"name": "Data", "type": "node", "mandatory": true},
-        {"name": "States", "type": "node", "mandatory": true}
-      ]
-    },
-    "StateMachine": {
-      "fields": [
-        {"name": "States", "type": "node", "mandatory": false}
-      ]
-    },
-    "RealTimeState": {
-      "fields": [
-        {"name": "Threads", "type": "node", "mandatory": true}
-      ]
-    },
-    "RealTimeThread": {
-      "fields": [
-        {"name": "Functions", "type": "array", "mandatory": true}
-      ]
-    },
-    "GAMScheduler": {
-      "fields": [
-        {"name": "TimingDataSource", "type": "reference", "mandatory": true}
-      ]
-    },
-    "TimingDataSource": {
-      "fields": [],
-      "direction": "IN"
-    },
-    "IOGAM": {
-      "fields": [
-        {"name": "InputSignals", "type": "node", "mandatory": false},
-        {"name": "OutputSignals", "type": "node", "mandatory": false}
-      ]
-    },
-    "ReferenceContainer": {
-      "fields": []
-    },
-    "ConstantGAM": {
-      "fields": []
-    },
-    "PIDGAM": {
-      "fields": [
-        {"name": "Kp", "type": "float", "mandatory": true},
-        {"name": "Ki", "type": "float", "mandatory": true},
-        {"name": "Kd", "type": "float", "mandatory": true}
-      ]
-    },
-    "FileDataSource": {
-      "fields": [
-        {"name": "Filename", "type": "string", "mandatory": true},
-        {"name": "Format", "type": "string", "mandatory": false}
-      ],
-      "direction": "INOUT"
-    },
-    "LoggerDataSource": {
-      "fields": [],
-      "direction": "OUT"
-    },
-    "DANStream": {
-      "fields": [
-        {"name": "Timeout", "type": "int", "mandatory": false}
-      ],
-      "direction": "OUT"
-    },
-    "EPICSCAInput": {
-      "fields": [],
-      "direction": "IN"
-    },
-    "EPICSCAOutput": {
-      "fields": [],
-      "direction": "OUT"
-    },
-    "EPICSPVAInput": {
-      "fields": [],
-      "direction": "IN"
-    },
-    "EPICSPVAOutput": {
-      "fields": [],
-      "direction": "OUT"
-    },
-    "SDNSubscriber": {
-      "fields": [
-        {"name": "Address", "type": "string", "mandatory": true},
-        {"name": "Port", "type": "int", "mandatory": true},
-        {"name": "Interface", "type": "string", "mandatory": false}
-      ],
-      "direction": "IN"
-    },
-    "SDNPublisher": {
-      "fields": [
-        {"name": "Address", "type": "string", "mandatory": true},
-        {"name": "Port", "type": "int", "mandatory": true},
-        {"name": "Interface", "type": "string", "mandatory": false}
-      ],
-      "direction": "OUT"
-    },
-    "UDPReceiver": {
-      "fields": [
-        {"name": "Port", "type": "int", "mandatory": true},
-        {"name": "Address", "type": "string", "mandatory": false}
-      ],
-      "direction": "IN"
-    },
-    "UDPSender": {
-      "fields": [
-        {"name": "Destination", "type": "string", "mandatory": true}
-      ],
-      "direction": "OUT"
-    },
-    "FileReader": {
-      "fields": [
-        {"name": "Filename", "type": "string", "mandatory": true},
-        {"name": "Format", "type": "string", "mandatory": false},
-        {"name": "Interpolate", "type": "string", "mandatory": false}
-      ],
-      "direction": "IN"
-    },
-    "FileWriter": {
-      "fields": [
-        {"name": "Filename", "type": "string", "mandatory": true},
-        {"name": "Format", "type": "string", "mandatory": false},
-        {"name": "StoreOnTrigger", "type": "int", "mandatory": false}
-      ],
-      "direction": "OUT"
-    },
-    "OrderedClass": {
-      "ordered": true,
-      "fields": [
-        {"name": "First", "type": "int", "mandatory": true},
-        {"name": "Second", "type": "string", "mandatory": true}
-      ]
-    },
-    "BaseLib2GAM": { "fields": [] },
-    "ConversionGAM": { "fields": [] },
-    "DoubleHandshakeGAM": { "fields": [] },
-    "FilterGAM": {
-      "fields": [
-        {"name": "Num", "type": "array", "mandatory": true},
-        {"name": "Den", "type": "array", "mandatory": true},
-        {"name": "ResetInEachState", "type": "any", "mandatory": false},
-        {"name": "InputSignals", "type": "node", "mandatory": false},
-        {"name": "OutputSignals", "type": "node", "mandatory": false}
-      ]
-    },
-    "HistogramGAM": {
-      "fields": [
-        {"name": "BeginCycleNumber", "type": "int", "mandatory": false},
-        {"name": "StateChangeResetName", "type": "string", "mandatory": false},
-        {"name": "InputSignals", "type": "node", "mandatory": false},
-        {"name": "OutputSignals", "type": "node", "mandatory": false}
-      ]
-    },
-    "Interleaved2FlatGAM": { "fields": [] },
-    "FlattenedStructIOGAM": { "fields": [] },
-    "MathExpressionGAM": {
-      "fields": [
-        {"name": "Expression", "type": "string", "mandatory": true},
-        {"name": "InputSignals", "type": "node", "mandatory": false},
-        {"name": "OutputSignals", "type": "node", "mandatory": false}
-      ]
-    },
-    "MessageGAM": { "fields": [] },
-    "MuxGAM": { "fields": [] },
-    "SimulinkWrapperGAM": { "fields": [] },
-    "SSMGAM": { "fields": [] },
-    "StatisticsGAM": { "fields": [] },
-    "TimeCorrectionGAM": { "fields": [] },
-    "TriggeredIOGAM": { "fields": [] },
-    "WaveformGAM": { "fields": [] },
-    "DAN": { "fields": [], "direction": "OUT" },
-    "LinuxTimer": {
-      "fields": [
-        {"name": "ExecutionMode", "type": "string", "mandatory": false},
-        {"name": "SleepNature", "type": "string", "mandatory": false},
-        {"name": "SleepPercentage", "type": "any", "mandatory": false},
-        {"name": "Phase", "type": "int", "mandatory": false},
-        {"name": "CPUMask", "type": "int", "mandatory": false},
-        {"name": "TimeProvider", "type": "node", "mandatory": false},
-        {"name": "Signals", "type": "node", "mandatory": true}
-      ],
-      "direction": "IN"
-    },
-    "LinkDataSource": { "fields": [], "direction": "INOUT" },
-    "MDSReader": {
-      "fields": [
-        {"name": "TreeName", "type": "string", "mandatory": true},
-        {"name": "ShotNumber", "type": "int", "mandatory": true},
-        {"name": "Frequency", "type": "float", "mandatory": true},
-        {"name": "Signals", "type": "node", "mandatory": true}
-      ],
-      "direction": "IN"
-    },
-    "MDSWriter": {
-      "fields": [
-        {"name": "NumberOfBuffers", "type": "int", "mandatory": true},
-        {"name": "CPUMask", "type": "int", "mandatory": true},
-        {"name": "StackSize", "type": "int", "mandatory": true},
-        {"name": "TreeName", "type": "string", "mandatory": true},
-        {"name": "PulseNumber", "type": "int", "mandatory": false},
-        {"name": "StoreOnTrigger", "type": "int", "mandatory": true},
-        {"name": "EventName", "type": "string", "mandatory": true},
-        {"name": "TimeRefresh", "type": "float", "mandatory": true},
-        {"name": "NumberOfPreTriggers", "type": "int", "mandatory": false},
-        {"name": "NumberOfPostTriggers", "type": "int", "mandatory": false},
-        {"name": "Signals", "type": "node", "mandatory": true},
-        {"name": "Messages", "type": "node", "mandatory": false}
-      ],
-      "direction": "OUT"
-    },
-    "NI1588TimeStamp": { "fields": [], "direction": "IN" },
-    "NI6259ADC": { "fields": [], "direction": "IN" },
-    "NI6259DAC": { "fields": [], "direction": "OUT" },
-    "NI6259DIO": { "fields": [], "direction": "INOUT" },
-    "NI6368ADC": { "fields": [], "direction": "IN" },
-    "NI6368DAC": { "fields": [], "direction": "OUT" },
-    "NI6368DIO": { "fields": [], "direction": "INOUT" },
-    "NI9157CircularFifoReader": { "fields": [], "direction": "IN" },
-    "NI9157MxiDataSource": { "fields": [], "direction": "INOUT" },
-    "OPCUADSInput": { "fields": [], "direction": "IN" },
-    "OPCUADSOutput": { "fields": [], "direction": "OUT" },
-    "RealTimeThreadAsyncBridge": { "fields": [] },
-    "RealTimeThreadSynchronisation": { "fields": [] },
-    "UARTDataSource": { "fields": [], "direction": "INOUT" },
-    "BaseLib2Wrapper": { "fields": [] },
-    "EPICSCAClient": { "fields": [] },
-    "EPICSPVA": { "fields": [] },
-    "MemoryGate": { "fields": [] },
-    "OPCUA": { "fields": [] },
-    "SysLogger": { "fields": [] },
-    "GAMDataSource": { "fields": [], "direction": "INOUT" }
-  }
-}
@@ -2,137 +2,73 @@ package schema
 
 import (
 	_ "embed"
-	"encoding/json"
 	"fmt"
 	"os"
 	"path/filepath"
+
+	"cuelang.org/go/cue"
+	"cuelang.org/go/cue/cuecontext"
 )
 
-//go:embed marte.json
-var defaultSchemaJSON []byte
+//go:embed marte.cue
+var defaultSchemaCUE []byte
 
 type Schema struct {
-	Classes map[string]ClassDefinition `json:"classes"`
-}
-
-type ClassDefinition struct {
-	Fields    []FieldDefinition `json:"fields"`
-	Ordered   bool              `json:"ordered"`
-	Direction string            `json:"direction"`
-}
-
-type FieldDefinition struct {
-	Name      string `json:"name"`
-	Type      string `json:"type"` // "int", "float", "string", "bool", "reference", "array", "node", "any"
-	Mandatory bool   `json:"mandatory"`
+	Context *cue.Context
+	Value   cue.Value
 }
 
 func NewSchema() *Schema {
+	ctx := cuecontext.New()
 	return &Schema{
-		Classes: make(map[string]ClassDefinition),
+		Context: ctx,
+		Value:   ctx.CompileBytes(defaultSchemaCUE),
 	}
 }
 
-func LoadSchema(path string) (*Schema, error) {
+// LoadSchema loads a CUE schema from a file and returns the cue.Value
+func LoadSchema(ctx *cue.Context, path string) (cue.Value, error) {
 	content, err := os.ReadFile(path)
 	if err != nil {
-		return nil, err
-	}
-
-	var s Schema
-	if err := json.Unmarshal(content, &s); err != nil {
-		return nil, fmt.Errorf("failed to parse schema: %v", err)
-	}
-
-	return &s, nil
-}
-
-// DefaultSchema returns the built-in embedded schema
-func DefaultSchema() *Schema {
-	var s Schema
-	if err := json.Unmarshal(defaultSchemaJSON, &s); err != nil {
-		panic(fmt.Sprintf("failed to parse default embedded schema: %v", err))
-	}
-	if s.Classes == nil {
-		s.Classes = make(map[string]ClassDefinition)
-	}
-	return &s
-}
-
-// Merge adds rules from 'other' to 's'.
-// Rules for the same class are merged (new fields added, existing fields updated).
-func (s *Schema) Merge(other *Schema) {
-	if other == nil {
-		return
-	}
-	for className, classDef := range other.Classes {
-		if existingClass, ok := s.Classes[className]; ok {
-			// Merge fields
-			fieldMap := make(map[string]FieldDefinition)
-			for _, f := range classDef.Fields {
-				fieldMap[f.Name] = f
-			}
-
-			var mergedFields []FieldDefinition
-			seen := make(map[string]bool)
-
-			// Keep existing fields, update if present in other
-			for _, f := range existingClass.Fields {
-				if newF, ok := fieldMap[f.Name]; ok {
-					mergedFields = append(mergedFields, newF)
-				} else {
-					mergedFields = append(mergedFields, f)
-				}
-				seen[f.Name] = true
-			}
-
-			// Append new fields
-			for _, f := range classDef.Fields {
-				if !seen[f.Name] {
-					mergedFields = append(mergedFields, f)
-				}
-			}
-
-			existingClass.Fields = mergedFields
-			if classDef.Ordered {
-				existingClass.Ordered = true
-			}
-			if classDef.Direction != "" {
-				existingClass.Direction = classDef.Direction
-			}
-			s.Classes[className] = existingClass
-		} else {
-			s.Classes[className] = classDef
-		}
+		return cue.Value{}, err
 	}
+	return ctx.CompileBytes(content), nil
 }
 
 func LoadFullSchema(projectRoot string) *Schema {
-	s := DefaultSchema()
+	ctx := cuecontext.New()
+	baseVal := ctx.CompileBytes(defaultSchemaCUE)
+	if baseVal.Err() != nil {
+		// Fallback or panic? Panic is appropriate for embedded schema failure
+		panic(fmt.Sprintf("Embedded schema invalid: %v", baseVal.Err()))
+	}
 
 	// 1. System Paths
 	sysPaths := []string{
-		"/usr/share/mdt/marte_schema.json",
+		"/usr/share/mdt/marte_schema.cue",
 	}
 
 	home, err := os.UserHomeDir()
 	if err == nil {
-		sysPaths = append(sysPaths, filepath.Join(home, ".local/share/mdt/marte_schema.json"))
+		sysPaths = append(sysPaths, filepath.Join(home, ".local/share/mdt/marte_schema.cue"))
 	}
 
 	for _, path := range sysPaths {
-		if sysSchema, err := LoadSchema(path); err == nil {
-			s.Merge(sysSchema)
+		if val, err := LoadSchema(ctx, path); err == nil && val.Err() == nil {
+			baseVal = baseVal.Unify(val)
 		}
 	}
 
 	// 2. Project Path
 	if projectRoot != "" {
-		projectSchemaPath := filepath.Join(projectRoot, ".marte_schema.json")
-		if projSchema, err := LoadSchema(projectSchemaPath); err == nil {
-			s.Merge(projSchema)
+		projectSchemaPath := filepath.Join(projectRoot, ".marte_schema.cue")
+		if val, err := LoadSchema(ctx, projectSchemaPath); err == nil && val.Err() == nil {
+			baseVal = baseVal.Unify(val)
 		}
 	}
 
-	return s
+	return &Schema{
+		Context: ctx,
+		Value:   baseVal,
+	}
 }
@@ -2,11 +2,15 @@ package validator
|
|||||||
|
|
||||||
import (
|
import (
|
||||||
"fmt"
|
"fmt"
|
||||||
|
"strconv"
|
||||||
"strings"
|
"strings"
|
||||||
|
|
||||||
"github.com/marte-dev/marte-dev-tools/internal/index"
|
"cuelang.org/go/cue"
|
||||||
"github.com/marte-dev/marte-dev-tools/internal/parser"
|
"cuelang.org/go/cue/errors"
|
||||||
"github.com/marte-dev/marte-dev-tools/internal/schema"
|
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/index"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/parser"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/schema"
|
||||||
)
|
)
|
||||||
|
|
||||||
type DiagnosticLevel int
|
type DiagnosticLevel int
|
||||||
@@ -68,37 +72,9 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// Collect fields and their definitions
|
|
||||||
fields := v.getFields(node)
|
fields := v.getFields(node)
|
||||||
fieldOrder := []string{}
|
|
||||||
for _, frag := range node.Fragments {
|
|
||||||
for _, def := range frag.Definitions {
|
|
||||||
if f, ok := def.(*parser.Field); ok {
|
|
||||||
if _, exists := fields[f.Name]; exists { // already collected
|
|
||||||
// Maintain order logic if needed, but getFields collects all.
|
|
||||||
// For strict order check we might need this loop.
|
|
||||||
// Let's assume getFields is enough for validation logic,
|
|
||||||
// but for "duplicate check" and "class validation" we iterate fields map.
|
|
||||||
// We need to construct fieldOrder.
|
|
||||||
// Just reuse loop for fieldOrder
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
// Re-construct fieldOrder for order validation
|
|
||||||
seen := make(map[string]bool)
|
|
||||||
for _, frag := range node.Fragments {
|
|
||||||
for _, def := range frag.Definitions {
|
|
||||||
if f, ok := def.(*parser.Field); ok {
|
|
||||||
if !seen[f.Name] {
|
|
||||||
fieldOrder = append(fieldOrder, f.Name)
|
|
||||||
seen[f.Name] = true
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// 1. Check for duplicate fields
|
// 1. Check for duplicate fields (Go logic)
|
||||||
for name, defs := range fields {
|
for name, defs := range fields {
|
||||||
if len(defs) > 1 {
|
if len(defs) > 1 {
|
||||||
firstFile := v.getFileForField(defs[0], node)
|
firstFile := v.getFileForField(defs[0], node)
|
||||||
@@ -139,11 +115,9 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// 3. Schema Validation
|
// 3. CUE Validation
|
||||||
if className != "" && v.Schema != nil {
|
if className != "" && v.Schema != nil {
|
||||||
if classDef, ok := v.Schema.Classes[className]; ok {
|
v.validateWithCUE(node, className)
|
||||||
v.validateClass(node, classDef, fields, fieldOrder)
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// 4. Signal Validation (for DataSource signals)
|
// 4. Signal Validation (for DataSource signals)
|
||||||
@@ -162,68 +136,95 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
func (v *Validator) validateClass(node *index.ProjectNode, classDef schema.ClassDefinition, fields map[string][]*parser.Field, fieldOrder []string) {
|
func (v *Validator) validateWithCUE(node *index.ProjectNode, className string) {
|
||||||
// ... (same as before)
|
// Check if class exists in schema
|
||||||
for _, fieldDef := range classDef.Fields {
|
classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s", className))
|
||||||
if fieldDef.Mandatory {
|
if v.Schema.Value.LookupPath(classPath).Err() != nil {
|
||||||
found := false
|
return // Unknown class, skip validation
|
||||||
if _, ok := fields[fieldDef.Name]; ok {
|
}
|
||||||
found = true
|
|
||||||
} else if fieldDef.Type == "node" {
|
|
||||||
if _, ok := node.Children[fieldDef.Name]; ok {
|
|
||||||
found = true
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
if !found {
|
// Convert node to map
|
||||||
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
data := v.nodeToMap(node)
|
||||||
Level: LevelError,
|
|
||||||
Message: fmt.Sprintf("Missing mandatory field '%s' for class '%s'", fieldDef.Name, node.Metadata["Class"]),
|
// Encode data to CUE
|
||||||
Position: v.getNodePosition(node),
|
dataVal := v.Schema.Context.Encode(data)
|
||||||
File: v.getNodeFile(node),
|
|
||||||
})
|
// Unify with #Object
|
||||||
}
|
// #Object requires "Class" field, which is present in data.
|
||||||
|
objDef := v.Schema.Value.LookupPath(cue.ParsePath("#Object"))
|
||||||
|
|
||||||
|
// Unify
|
||||||
|
res := objDef.Unify(dataVal)
|
||||||
|
|
||||||
|
 	if err := res.Validate(cue.Concrete(true)); err != nil {
+		// Report errors
+		// Parse CUE error to diagnostic
+		v.reportCUEError(err, node)
+	}
+}
+
+func (v *Validator) reportCUEError(err error, node *index.ProjectNode) {
+	list := errors.Errors(err)
+	for _, e := range list {
+		msg := e.Error()
+		v.Diagnostics = append(v.Diagnostics, Diagnostic{
+			Level:    LevelError,
+			Message:  fmt.Sprintf("Schema Validation Error: %v", msg),
+			Position: v.getNodePosition(node),
+			File:     v.getNodeFile(node),
+		})
+	}
+}
+
+func (v *Validator) nodeToMap(node *index.ProjectNode) map[string]interface{} {
+	m := make(map[string]interface{})
+	fields := v.getFields(node)
+
+	for name, defs := range fields {
+		if len(defs) > 0 {
+			// Use the last definition (duplicates checked elsewhere)
+			m[name] = v.valueToInterface(defs[len(defs)-1].Value)
+		}
 	}
 	}
-	for _, fieldDef := range classDef.Fields {
-		if fList, ok := fields[fieldDef.Name]; ok {
-			f := fList[0]
-			if !v.checkType(f.Value, fieldDef.Type) {
-				v.Diagnostics = append(v.Diagnostics, Diagnostic{
-					Level:    LevelError,
-					Message:  fmt.Sprintf("Field '%s' expects type '%s'", fieldDef.Name, fieldDef.Type),
-					Position: f.Position,
-					File:     v.getFileForField(f, node),
-				})
-			}
-		}
-	}
+	// Children as nested maps?
+	// CUE schema expects nested structs for "node" type fields.
+	// But `node.Children` contains ALL children (even those defined as +Child).
+	// If schema expects `States: { ... }`, we map children.
+	for name, child := range node.Children {
+		// normalize name? CUE keys are strings.
+		// If child real name is "+States", key in Children is "States".
+		// We use "States" as key in map.
+		m[name] = v.nodeToMap(child)
+	}
 	}
-	if classDef.Ordered {
-		schemaIdx := 0
-		for _, nodeFieldName := range fieldOrder {
-			foundInSchema := false
-			for i, fd := range classDef.Fields {
-				if fd.Name == nodeFieldName {
-					foundInSchema = true
-					if i < schemaIdx {
-						v.Diagnostics = append(v.Diagnostics, Diagnostic{
-							Level:    LevelError,
-							Message:  fmt.Sprintf("Field '%s' is out of order", nodeFieldName),
-							Position: fields[nodeFieldName][0].Position,
-							File:     v.getFileForField(fields[nodeFieldName][0], node),
-						})
-					} else {
-						schemaIdx = i
-					}
-					break
-				}
-			}
-			if !foundInSchema {
-			}
+	return m
+}
+
+func (v *Validator) valueToInterface(val parser.Value) interface{} {
+	switch t := val.(type) {
+	case *parser.StringValue:
+		return t.Value
+	case *parser.IntValue:
+		i, _ := strconv.ParseInt(t.Raw, 0, 64)
+		return i // CUE handles int64
+	case *parser.FloatValue:
+		f, _ := strconv.ParseFloat(t.Raw, 64)
+		return f
+	case *parser.BoolValue:
+		return t.Value
+	case *parser.ReferenceValue:
+		return t.Value
+	case *parser.ArrayValue:
+		var arr []interface{}
+		for _, e := range t.Elements {
+			arr = append(arr, v.valueToInterface(e))
-		}
 		}
+		return arr
 	}
+	return nil
 }
 func (v *Validator) validateSignal(node *index.ProjectNode, fields map[string][]*parser.Field) {

@@ -308,12 +309,17 @@ func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, di
 	}
 	}

-	// Check Direction
+	// Check Direction using CUE Schema
 	dsClass := v.getNodeClass(dsNode)
 	if dsClass != "" {
-		if classDef, ok := v.Schema.Classes[dsClass]; ok {
-			dsDir := classDef.Direction
-			if dsDir != "" {
+		// Lookup class definition in Schema
+		// path: #Classes.ClassName.direction
+		path := cue.ParsePath(fmt.Sprintf("#Classes.%s.direction", dsClass))
+		val := v.Schema.Value.LookupPath(path)
+
+		if val.Err() == nil {
+			dsDir, err := val.String()
+			if err == nil && dsDir != "" {
 				if direction == "Input" && dsDir == "OUT" {
 					v.Diagnostics = append(v.Diagnostics, Diagnostic{
 						Level: LevelError,
@@ -509,7 +515,7 @@ func (v *Validator) getFieldValue(f *parser.Field) string {

 func (v *Validator) resolveReference(name string, file string, predicate func(*index.ProjectNode) bool) *index.ProjectNode {
 	if isoNode, ok := v.Tree.IsolatedFiles[file]; ok {
-		if found := v.findNodeRecursive(isoNode, name, predicate); found != nil {
+		if found := v.Tree.FindNode(isoNode, name, predicate); found != nil {
 			return found
 		}
 		return nil
@@ -517,24 +523,7 @@ func (v *Validator) resolveReference(name string, file string, predicate func(*i
 	if v.Tree.Root == nil {
 		return nil
 	}
-	return v.findNodeRecursive(v.Tree.Root, name, predicate)
-}
-
-func (v *Validator) findNodeRecursive(root *index.ProjectNode, name string, predicate func(*index.ProjectNode) bool) *index.ProjectNode {
-	// Simple recursive search matching name
-	if root.RealName == name || root.Name == index.NormalizeName(name) {
-		if predicate == nil || predicate(root) {
-			return root
-		}
-	}
-
-	// Recursive
-	for _, child := range root.Children {
-		if found := v.findNodeRecursive(child, name, predicate); found != nil {
-			return found
-		}
-	}
-	return nil
+	return v.Tree.FindNode(v.Tree.Root, name, predicate)
 }

 func (v *Validator) getNodeClass(node *index.ProjectNode) string {
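The removed `findNodeRecursive` is a plain depth-first search with an optional predicate filter; the refactor just moves that logic behind `Tree.FindNode`. A self-contained sketch of the same pattern, with a simplified stand-in `Node` type invented for illustration:

```go
package main

import "fmt"

// Node is a simplified stand-in for index.ProjectNode.
type Node struct {
	Name     string
	Children map[string]*Node
}

// findNode performs the same depth-first, predicate-filtered search
// the removed findNodeRecursive did: match the current node first,
// then recurse into children until a match is found.
func findNode(root *Node, name string, pred func(*Node) bool) *Node {
	if root.Name == name && (pred == nil || pred(root)) {
		return root
	}
	for _, child := range root.Children {
		if found := findNode(child, name, pred); found != nil {
			return found
		}
	}
	return nil
}

func main() {
	tree := &Node{Name: "Root", Children: map[string]*Node{
		"App": {Name: "App", Children: map[string]*Node{
			"Data": {Name: "Data"},
		}},
	}}
	fmt.Println(findNode(tree, "Data", nil).Name)
}
```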
@@ -554,32 +543,7 @@ func isValidType(t string) bool {
 }

 func (v *Validator) checkType(val parser.Value, expectedType string) bool {
-	// ... (same as before)
-	switch expectedType {
-	case "int":
-		_, ok := val.(*parser.IntValue)
-		return ok
-	case "float":
-		_, ok := val.(*parser.FloatValue)
-		return ok
-	case "string":
-		_, okStr := val.(*parser.StringValue)
-		_, okRef := val.(*parser.ReferenceValue)
-		return okStr || okRef
-	case "bool":
-		_, ok := val.(*parser.BoolValue)
-		return ok
-	case "array":
-		_, ok := val.(*parser.ArrayValue)
-		return ok
-	case "reference":
-		_, ok := val.(*parser.ReferenceValue)
-		return ok
-	case "node":
-		return true
-	case "any":
-		return true
-	}
+	// Legacy function, replaced by CUE.
 	return true
 }
@@ -696,7 +660,8 @@ func isDataSource(node *index.ProjectNode) bool {
 	if node.Parent != nil && node.Parent.Name == "Data" {
 		return true
 	}
-	return false
+	_, hasSignals := node.Children["Signals"]
+	return hasSignals
 }

 func isSignal(node *index.ProjectNode) bool {
@@ -29,7 +29,12 @@ The LSP server should provide the following capabilities:
 - **Go to Definition**: Jump to the definition of a reference, supporting navigation across any file in the current project.
 - **Go to References**: Find usages of a node or field, supporting navigation across any file in the current project.
 - **Code Completion**: Autocomplete fields, values, and references.
-  - **Code Snippets**: Provide snippets for common patterns.
+  - **Context-Aware**: Suggestions depend on the cursor position (e.g., inside an object, assigning a value).
+  - **Schema-Driven**: Field suggestions are derived from the CUE schema for the current object's Class, indicating mandatory vs. optional fields.
+  - **Reference Suggestions**:
+    - `DataSource` fields suggest available DataSource objects.
+    - `Functions` (in Threads) suggest available GAM objects.
+  - **Code Snippets**: Provide snippets for common patterns (e.g., `+Object = { ... }`).
 - **Formatting**: Format the document using the same rules and engine as the `fmt` command.

 ## Build System & File Structure
@@ -47,9 +52,9 @@ The LSP server should provide the following capabilities:
 - **Namespace Consistency**: The build tool must verify that all input files belong to the same project namespace (the first segment of the `#package` URI). If multiple project namespaces are detected, the build must fail with an error.
 - **Target**: The build output is written to a single target file (e.g., provided via CLI or API).
 - **Multi-File Definitions**: Nodes and objects can be defined across multiple files. The build tool, validator, and LSP must merge these definitions (including all fields and sub-nodes) from the entire project to create a unified view before processing or validating.
-- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project.
+- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project. Support for dot-separated paths (e.g., `Node.SubNode`) is required.
-- **Merging Order**: For objects defined across multiple files, the **first file** to be considered is the one containing the `Class` field definition.
+- **Merging Order**: For objects defined across multiple files, definitions are merged. The build tool must preserve the relative order of fields and sub-nodes as they appear in the source files, interleaving them correctly in the final output.
-- **Field Order**: Within a single file, the relative order of defined fields must be maintained.
+- **Field Order**: Within a single file (and across merged files), the relative order of defined fields must be maintained in the output.
 - The LSP indexes only files belonging to the same project/namespace scope.
 - **Output**: The output format is the same as the input configuration but without the `#package` macro.
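As a hedged illustration of the merging rules above (file names, classes, and fields here are invented for the sketch, not taken from a real project), two files contributing to the same object and the merged result:

```
// a.marte
#package MYPROJ.App
+MyGAM = {
    Class = IOGAM
}

// b.marte
#package MYPROJ.App
+MyGAM = {
    +InputSignals = {
        S1 = { Type = uint32 }
    }
}

// built output (same format, #package macro removed)
+MyGAM = {
    Class = IOGAM
    +InputSignals = {
        S1 = { Type = uint32 }
    }
}
```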
@@ -160,13 +165,13 @@ The tool must build an index of the configuration to support LSP features and va
 - **Field Order**: Verification that specific fields appear in a prescribed order when required by the class definition.
 - **Conditional Fields**: Validation of fields whose presence or value depends on the values of other fields within the same node or context.
 - **Schema Definition**:
-  - Class validation rules must be defined in a separate schema file.
+  - Class validation rules must be defined in a separate schema file using the **CUE** language.
   - **Project-Specific Classes**: Developers can define their own project-specific classes and corresponding validation rules, expanding the validation capabilities for their specific needs.
 - **Schema Loading**:
-  - **Default Schema**: The tool should look for a default schema file `marte_schema.json` in standard system locations:
+  - **Default Schema**: The tool should look for a default schema file `marte_schema.cue` in standard system locations:
-    - `/usr/share/mdt/marte_schema.json`
+    - `/usr/share/mdt/marte_schema.cue`
-    - `$HOME/.local/share/mdt/marte_schema.json`
+    - `$HOME/.local/share/mdt/marte_schema.cue`
-  - **Project Schema**: If a file named `.marte_schema.json` exists in the project root, it must be loaded.
+  - **Project Schema**: If a file named `.marte_schema.cue` exists in the project root, it must be loaded.
   - **Merging**: The final schema is a merge of the built-in schema, the system default schema (if found), and the project-specific schema. Rules in later sources (Project > System > Built-in) append to or override earlier ones.
 - **Duplicate Fields**:
   - **Constraint**: A field must not be defined more than once within the same object/node scope, even if those definitions are spread across different files.
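A minimal sketch of what a project `.marte_schema.cue` rule could look like. The `#Classes` layout and the enum-style field mirror the custom schema injected by the completion tests later in this diff; the shape of the built-in schema beyond that is an assumption:

```cue
package schema

#Classes: {
	// Project-specific class: Mode is constrained to an
	// enum via a CUE disjunction of string literals.
	TestEnumClass: {
		Mode: "Auto" | "Manual"
	}
}
```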
@@ -5,7 +5,7 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/builder"
+	"github.com/marte-community/marte-dev-tools/internal/builder"
 )

 func TestMultiFileBuildMergeAndOrder(t *testing.T) {

@@ -7,11 +7,11 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/builder"
+	"github.com/marte-community/marte-dev-tools/internal/builder"
-	"github.com/marte-dev/marte-dev-tools/internal/formatter"
+	"github.com/marte-community/marte-dev-tools/internal/formatter"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestCheckCommand(t *testing.T) {
test/lsp_completion_test.go (new file, 320 lines)
@@ -0,0 +1,320 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/lsp"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/schema"
)

func TestHandleCompletion(t *testing.T) {
	setup := func() {
		lsp.Tree = index.NewProjectTree()
		lsp.Documents = make(map[string]string)
		lsp.ProjectRoot = "."
		lsp.GlobalSchema = schema.NewSchema()
	}

	uri := "file://test.marte"
	path := "test.marte"

	t.Run("Suggest Classes", func(t *testing.T) {
		setup()
		content := "+Obj = { Class = "
		lsp.Documents[uri] = content

		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 0, Character: len(content)},
		}

		list := lsp.HandleCompletion(params)
		if list == nil || len(list.Items) == 0 {
			t.Fatal("Expected class suggestions, got none")
		}

		found := false
		for _, item := range list.Items {
			if item.Label == "RealTimeApplication" {
				found = true
				break
			}
		}
		if !found {
			t.Error("Expected RealTimeApplication in class suggestions")
		}
	})

	t.Run("Suggest Fields", func(t *testing.T) {
		setup()
		content := `
+MyApp = {
    Class = RealTimeApplication

}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		// Position at line 3 (empty line inside MyApp)
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 3, Character: 4},
		}

		list := lsp.HandleCompletion(params)
		if list == nil || len(list.Items) == 0 {
			t.Fatal("Expected field suggestions, got none")
		}

		foundData := false
		for _, item := range list.Items {
			if item.Label == "Data" {
				foundData = true
				if item.Detail != "Mandatory" {
					t.Errorf("Expected Data to be Mandatory, got %s", item.Detail)
				}
			}
		}
		if !foundData {
			t.Error("Expected 'Data' in field suggestions for RealTimeApplication")
		}
	})

	t.Run("Suggest References (DataSource)", func(t *testing.T) {
		setup()
		content := `
$App = {
    $Data = {
        +InDS = {
            Class = FileReader
            +Signals = {
                Sig1 = { Type = uint32 }
            }
        }
    }
}
+MyGAM = {
    Class = IOGAM
    +InputSignals = {
        S1 = { DataSource = }
    }
}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)
		lsp.Tree.ResolveReferences()

		// Position at end of "DataSource = "
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 14, Character: 28},
		}

		list := lsp.HandleCompletion(params)
		if list == nil || len(list.Items) == 0 {
			t.Fatal("Expected DataSource suggestions, got none")
		}

		foundDS := false
		for _, item := range list.Items {
			if item.Label == "InDS" {
				foundDS = true
				break
			}
		}
		if !foundDS {
			t.Error("Expected 'InDS' in suggestions for DataSource field")
		}
	})

	t.Run("Filter Existing Fields", func(t *testing.T) {
		setup()
		content := `
+MyThread = {
    Class = RealTimeThread
    Functions = { }

}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		// Position at line 4
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 4, Character: 4},
		}

		list := lsp.HandleCompletion(params)
		for _, item := range list.Items {
			if item.Label == "Functions" || item.Label == "Class" {
				t.Errorf("Did not expect already defined field %s in suggestions", item.Label)
			}
		}
	})

	t.Run("Scope-aware suggestions", func(t *testing.T) {
		setup()
		// Define a project DataSource in one file
		cfg1, _ := parser.NewParser("#package MYPROJ.Data\n+ProjectDS = { Class = FileReader +Signals = { S1 = { Type = int32 } } }").Parse()
		lsp.Tree.AddFile("project_ds.marte", cfg1)

		// Define an isolated file
		contentIso := "+MyGAM = { Class = IOGAM +InputSignals = { S1 = { DataSource = } } }"
		lsp.Documents["file://iso.marte"] = contentIso
		cfg2, _ := parser.NewParser(contentIso).Parse()
		lsp.Tree.AddFile("iso.marte", cfg2)

		lsp.Tree.ResolveReferences()

		// Completion in isolated file
		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: "file://iso.marte"},
			Position:     lsp.Position{Line: 0, Character: strings.Index(contentIso, "DataSource = ") + len("DataSource = ") + 1},
		}

		list := lsp.HandleCompletion(params)
		foundProjectDS := false
		if list != nil {
			for _, item := range list.Items {
				if item.Label == "ProjectDS" {
					foundProjectDS = true
					break
				}
			}
		}
		if foundProjectDS {
			t.Error("Did not expect ProjectDS in isolated file suggestions")
		}

		// Completion in a project file
		lineContent := "+MyGAM = { Class = IOGAM +InputSignals = { S1 = { DataSource = Dummy } } }"
		contentPrj := "#package MYPROJ.App\n" + lineContent
		lsp.Documents["file://prj.marte"] = contentPrj
		pPrj := parser.NewParser(contentPrj)
		cfg3, err := pPrj.Parse()
		if err != nil {
			t.Logf("Parser error in contentPrj: %v", err)
		}
		lsp.Tree.AddFile("prj.marte", cfg3)
		lsp.Tree.ResolveReferences()

		paramsPrj := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: "file://prj.marte"},
			Position:     lsp.Position{Line: 1, Character: strings.Index(lineContent, "Dummy")},
		}

		listPrj := lsp.HandleCompletion(paramsPrj)
		foundProjectDS = false
		if listPrj != nil {
			for _, item := range listPrj.Items {
				if item.Label == "ProjectDS" {
					foundProjectDS = true
					break
				}
			}
		}
		if !foundProjectDS {
			t.Error("Expected ProjectDS in project file suggestions")
		}
	})

	t.Run("Suggest Signal Types", func(t *testing.T) {
		setup()
		content := `
+DS = {
    Class = FileReader
    Signals = {
        S1 = { Type = }
    }
}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 4, Character: strings.Index(content, "Type = ") + len("Type = ") + 1},
		}

		list := lsp.HandleCompletion(params)
		if list == nil {
			t.Fatal("Expected signal type suggestions")
		}

		foundUint32 := false
		for _, item := range list.Items {
			if item.Label == "uint32" {
				foundUint32 = true
				break
			}
		}
		if !foundUint32 {
			t.Error("Expected uint32 in suggestions")
		}
	})

	t.Run("Suggest CUE Enums", func(t *testing.T) {
		setup()
		// Inject custom schema with enum
		custom := []byte(`
package schema
#Classes: {
    TestEnumClass: {
        Mode: "Auto" | "Manual"
    }
}
`)
		val := lsp.GlobalSchema.Context.CompileBytes(custom)
		lsp.GlobalSchema.Value = lsp.GlobalSchema.Value.Unify(val)

		content := `
+Obj = {
    Class = TestEnumClass
    Mode =
}
`
		lsp.Documents[uri] = content
		p := parser.NewParser(content)
		cfg, _ := p.Parse()
		lsp.Tree.AddFile(path, cfg)

		params := lsp.CompletionParams{
			TextDocument: lsp.TextDocumentIdentifier{URI: uri},
			Position:     lsp.Position{Line: 3, Character: strings.Index(content, "Mode = ") + len("Mode = ") + 1},
		}

		list := lsp.HandleCompletion(params)
		if list == nil {
			t.Fatal("Expected enum suggestions")
		}

		foundAuto := false
		for _, item := range list.Items {
			if item.Label == "\"Auto\"" { // CUE string value includes quotes
				foundAuto = true
				break
			}
		}
		if !foundAuto {
			// Check if it returned without quotes?
			// v.String() returns quoted for string.
			t.Error("Expected \"Auto\" in suggestions")
			for _, item := range list.Items {
				t.Logf("Suggestion: %s", item.Label)
			}
		}
	})
}
@@ -3,8 +3,8 @@ package integration
 import (
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )

 func TestLSPHoverDoc(t *testing.T) {

@@ -3,8 +3,8 @@ package integration
 import (
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )

 func TestGetNodeContaining(t *testing.T) {
test/lsp_server_test.go (new file, 199 lines)
@@ -0,0 +1,199 @@
package integration

import (
	"encoding/json"
	"os"
	"path/filepath"
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/lsp"
	"github.com/marte-community/marte-dev-tools/internal/parser"
)

func TestInitProjectScan(t *testing.T) {
	// 1. Setup temp dir with files
	tmpDir, err := os.MkdirTemp("", "lsp_test")
	if err != nil {
		t.Fatal(err)
	}
	defer os.RemoveAll(tmpDir)

	// File 1: Definition
	if err := os.WriteFile(filepath.Join(tmpDir, "def.marte"), []byte("#package Test.Common\n+Target = { Class = C }"), 0644); err != nil {
		t.Fatal(err)
	}
	// File 2: Reference
	if err := os.WriteFile(filepath.Join(tmpDir, "ref.marte"), []byte("#package Test.Common\n+Source = { Class = C Link = Target }"), 0644); err != nil {
		t.Fatal(err)
	}

	// 2. Initialize
	lsp.Tree = index.NewProjectTree() // Reset global tree

	initParams := lsp.InitializeParams{RootPath: tmpDir}
	paramsBytes, _ := json.Marshal(initParams)

	msg := &lsp.JsonRpcMessage{
		Method: "initialize",
		Params: paramsBytes,
		ID:     1,
	}

	lsp.HandleMessage(msg)

	// Query the reference in ref.marte at "Target"
	defParams := lsp.DefinitionParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + filepath.Join(tmpDir, "ref.marte")},
		Position:     lsp.Position{Line: 1, Character: 29},
	}

	res := lsp.HandleDefinition(defParams)
	if res == nil {
		t.Fatal("Definition not found via LSP after initialization")
	}

	locs, ok := res.([]lsp.Location)
	if !ok {
		t.Fatalf("Expected []lsp.Location, got %T", res)
	}

	if len(locs) == 0 {
		t.Fatal("No locations found")
	}

	// Verify uri points to def.marte
	expectedURI := "file://" + filepath.Join(tmpDir, "def.marte")
	if locs[0].URI != expectedURI {
		t.Errorf("Expected URI %s, got %s", expectedURI, locs[0].URI)
	}
}

func TestHandleDefinition(t *testing.T) {
	// Reset tree for test
	lsp.Tree = index.NewProjectTree()

	content := `
+MyObject = {
    Class = Type
}
+RefObject = {
    Class = Type
    RefField = MyObject
}
`
	path := "/test.marte"
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	lsp.Tree.AddFile(path, config)
	lsp.Tree.ResolveReferences()

	t.Logf("Refs: %d", len(lsp.Tree.References))
	for _, r := range lsp.Tree.References {
		t.Logf("  %s at %d:%d", r.Name, r.Position.Line, r.Position.Column)
	}

	// Test Go to Definition on MyObject reference
	params := lsp.DefinitionParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + path},
		Position:     lsp.Position{Line: 6, Character: 15}, // "MyObject" in RefField = MyObject
	}

	result := lsp.HandleDefinition(params)
	if result == nil {
		t.Fatal("HandleDefinition returned nil")
	}

	locations, ok := result.([]lsp.Location)
	if !ok {
		t.Fatalf("Expected []lsp.Location, got %T", result)
	}

	if len(locations) != 1 {
		t.Fatalf("Expected 1 location, got %d", len(locations))
	}

	if locations[0].Range.Start.Line != 1 { // +MyObject is on line 2 (0-indexed 1)
		t.Errorf("Expected definition on line 1, got %d", locations[0].Range.Start.Line)
	}
}

func TestHandleReferences(t *testing.T) {
	// Reset tree for test
	lsp.Tree = index.NewProjectTree()

	content := `
+MyObject = {
    Class = Type
}
+RefObject = {
    Class = Type
    RefField = MyObject
}
+AnotherRef = {
    Ref = MyObject
}
`
	path := "/test.marte"
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	lsp.Tree.AddFile(path, config)
	lsp.Tree.ResolveReferences()

	// Test Find References for MyObject (triggered from its definition)
	params := lsp.ReferenceParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + path},
		Position:     lsp.Position{Line: 1, Character: 1}, // "+MyObject"
		Context:      lsp.ReferenceContext{IncludeDeclaration: true},
	}

	locations := lsp.HandleReferences(params)
	if len(locations) != 3 { // 1 declaration + 2 references
		t.Fatalf("Expected 3 locations, got %d", len(locations))
	}
}

func TestLSPFormatting(t *testing.T) {
	// Setup
	content := `
#package Proj.Main
+Object={
Field=1
}
`
	uri := "file:///test.marte"

	// Open (populate Documents map)
	lsp.Documents[uri] = content

	// Format
	params := lsp.DocumentFormattingParams{
		TextDocument: lsp.TextDocumentIdentifier{URI: uri},
	}

	edits := lsp.HandleFormatting(params)

	if len(edits) != 1 {
		t.Fatalf("Expected 1 edit, got %d", len(edits))
	}

	newText := edits[0].NewText

	expected := `#package Proj.Main

+Object = {
    Field = 1
}
`
	// Normalize newlines for comparison just in case
	if strings.TrimSpace(strings.ReplaceAll(newText, "\r\n", "\n")) != strings.TrimSpace(strings.ReplaceAll(expected, "\r\n", "\n")) {
		t.Errorf("Formatting mismatch.\nExpected:\n%s\nGot:\n%s", expected, newText)
	}
}
@@ -3,9 +3,9 @@ package integration
 import (
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestLSPSignalReferences(t *testing.T) {
@@ -52,13 +52,19 @@ func TestLSPSignalReferences(t *testing.T) {
 
 	// Traverse to MySig
 	dataNode := root.Children["Data"]
-	if dataNode == nil { t.Fatal("Data node not found") }
+	if dataNode == nil {
+		t.Fatal("Data node not found")
+	}
 
 	myDS := dataNode.Children["MyDS"]
-	if myDS == nil { t.Fatal("MyDS node not found") }
+	if myDS == nil {
+		t.Fatal("MyDS node not found")
+	}
 
 	signals := myDS.Children["Signals"]
-	if signals == nil { t.Fatal("Signals node not found") }
+	if signals == nil {
+		t.Fatal("Signals node not found")
+	}
 
 	mySigDef := signals.Children["MySig"]
 	if mySigDef == nil {
@@ -4,9 +4,9 @@ import (
 	"io/ioutil"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 // Helper to load and parse a file
@@ -1,9 +1,9 @@
-package parser_test
+package integration
 
 import (
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 func TestParserStrictness(t *testing.T) {
@@ -1,7 +1,9 @@
-package parser
+package integration
 
 import (
 	"testing"
+
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 
 func TestParseBasic(t *testing.T) {
@@ -22,7 +24,7 @@ $Node2 = {
     Array = {1 2 3}
 }
 `
-	p := NewParser(input)
+	p := parser.NewParser(input)
 	config, err := p.Parse()
 	if err != nil {
 		t.Fatalf("Parse error: %v", err)
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestMDSWriterValidation(t *testing.T) {
@@ -38,7 +38,7 @@ func TestMDSWriterValidation(t *testing.T) {
 
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'TreeName'") {
+		if strings.Contains(d.Message, "TreeName: incomplete value") {
 			found = true
 			break
 		}
@@ -71,7 +71,7 @@ func TestMathExpressionGAMValidation(t *testing.T) {
 
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'Expression'") {
+		if strings.Contains(d.Message, "Expression: incomplete value") {
 			found = true
 			break
 		}
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestPIDGAMValidation(t *testing.T) {
@@ -35,10 +35,10 @@ func TestPIDGAMValidation(t *testing.T) {
 	foundKd := false
 
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'Ki'") {
+		if strings.Contains(d.Message, "Ki: incomplete value") {
 			foundKi = true
 		}
-		if strings.Contains(d.Message, "Missing mandatory field 'Kd'") {
+		if strings.Contains(d.Message, "Kd: incomplete value") {
 			foundKd = true
 		}
 	}
@@ -73,7 +73,7 @@ func TestFileDataSourceValidation(t *testing.T) {
 
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'Filename'") {
+		if strings.Contains(d.Message, "Filename: incomplete value") {
 			found = true
 			break
 		}
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestRealTimeApplicationValidation(t *testing.T) {
@@ -35,14 +35,20 @@ func TestRealTimeApplicationValidation(t *testing.T) {
 	missingStates := false
 
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'Data'") {
+		if strings.Contains(d.Message, "Data: field is required") {
 			missingData = true
 		}
-		if strings.Contains(d.Message, "Missing mandatory field 'States'") {
+		if strings.Contains(d.Message, "States: field is required") {
 			missingStates = true
 		}
 	}
 
+	if !missingData || !missingStates {
+		for _, d := range v.Diagnostics {
+			t.Logf("Diagnostic: %s", d.Message)
+		}
+	}
+
 	if !missingData {
 		t.Error("Expected error for missing 'Data' field in RealTimeApplication")
 	}
@@ -73,7 +79,7 @@ func TestGAMSchedulerValidation(t *testing.T) {
 
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'TimingDataSource'") {
+		if strings.Contains(d.Message, "TimingDataSource: incomplete value") {
 			found = true
 			break
 		}
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestSDNSubscriberValidation(t *testing.T) {
@@ -32,7 +32,7 @@ func TestSDNSubscriberValidation(t *testing.T) {
 
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'Port'") {
+		if strings.Contains(d.Message, "Port: incomplete value") {
 			found = true
 			break
 		}
@@ -65,7 +65,7 @@ func TestFileWriterValidation(t *testing.T) {
 
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'Filename'") {
+		if strings.Contains(d.Message, "Filename: incomplete value") {
 			found = true
 			break
 		}
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestFunctionsArrayValidation(t *testing.T) {
@@ -49,13 +49,13 @@ func TestFunctionsArrayValidation(t *testing.T) {
 
 	for _, d := range v.Diagnostics {
 		if strings.Contains(d.Message, "not found or is not a valid GAM") {
 			// This covers both InvalidGAM and MissingGAM cases
 			if strings.Contains(d.Message, "InvalidGAM") {
 				foundInvalid = true
 			}
 			if strings.Contains(d.Message, "MissingGAM") {
 				foundMissing = true
 			}
 		}
 		if strings.Contains(d.Message, "must contain references") {
 			foundNotRef = true
test/validator_gam_direction_test.go (new file, 85 lines)
@@ -0,0 +1,85 @@
+package integration
+
+import (
+	"strings"
+	"testing"
+
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
+)
+
+func TestGAMSignalDirectionality(t *testing.T) {
+	content := `
+$App = {
+    $Data = {
+        +InDS = { Class = FileReader Filename="f" +Signals = { S1 = { Type = uint32 } } }
+        +OutDS = { Class = FileWriter Filename="f" +Signals = { S1 = { Type = uint32 } } }
+        +InOutDS = { Class = FileDataSource Filename="f" +Signals = { S1 = { Type = uint32 } } }
+    }
+    +ValidGAM = {
+        Class = IOGAM
+        InputSignals = {
+            S1 = { DataSource = InDS }
+            S2 = { DataSource = InOutDS Alias = S1 }
+        }
+        OutputSignals = {
+            S3 = { DataSource = OutDS Alias = S1 }
+            S4 = { DataSource = InOutDS Alias = S1 }
+        }
+    }
+    +InvalidGAM = {
+        Class = IOGAM
+        InputSignals = {
+            BadIn = { DataSource = OutDS Alias = S1 }
+        }
+        OutputSignals = {
+            BadOut = { DataSource = InDS Alias = S1 }
+        }
+    }
+}
+`
+	p := parser.NewParser(content)
+	config, err := p.Parse()
+	if err != nil {
+		t.Fatalf("Parse failed: %v", err)
+	}
+
+	idx := index.NewProjectTree()
+	idx.AddFile("dir.marte", config)
+	idx.ResolveReferences()
+
+	v := validator.NewValidator(idx, ".")
+	v.ValidateProject()
+
+	// Check ValidGAM has NO directionality errors
+	for _, d := range v.Diagnostics {
+		if strings.Contains(d.Message, "is Output-only but referenced in InputSignals") ||
+			strings.Contains(d.Message, "is Input-only but referenced in OutputSignals") {
+			if strings.Contains(d.Message, "ValidGAM") {
+				t.Errorf("Unexpected direction error for ValidGAM: %s", d.Message)
+			}
+		}
+	}
+
+	// Check InvalidGAM HAS errors
+	foundBadIn := false
+	foundBadOut := false
+	for _, d := range v.Diagnostics {
+		if strings.Contains(d.Message, "InvalidGAM") {
+			if strings.Contains(d.Message, "is Output-only but referenced in InputSignals") {
+				foundBadIn = true
+			}
+			if strings.Contains(d.Message, "is Input-only but referenced in OutputSignals") {
+				foundBadOut = true
+			}
+		}
+	}
+
+	if !foundBadIn {
+		t.Error("Expected error for OutDS in InputSignals of InvalidGAM")
+	}
+	if !foundBadOut {
+		t.Error("Expected error for InDS in OutputSignals of InvalidGAM")
+	}
+}
@@ -3,9 +3,9 @@ package integration
 import (
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestGAMSignalLinking(t *testing.T) {
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestGAMSignalValidation(t *testing.T) {
@@ -82,7 +82,7 @@ func TestGAMSignalValidation(t *testing.T) {
 		if strings.Contains(d.Message, "DataSource 'OutDS' (Class FileWriter) is Output-only but referenced in InputSignals") {
 			foundBadInput = true
 		}
-		if strings.Contains(d.Message, "Signal 'MissingSig' not found in DataSource 'InDS'") {
+		if strings.Contains(d.Message, "Implicitly Defined Signal: 'MissingSig'") {
 			foundMissing = true
 		}
 		if strings.Contains(d.Message, "DataSource 'InDS' (Class FileReader) is Input-only but referenced in OutputSignals") {
@@ -91,10 +91,10 @@ func TestGAMSignalValidation(t *testing.T) {
 	}
 
 	if !foundBadInput || !foundMissing || !foundBadOutput {
 		for _, d := range v.Diagnostics {
 			t.Logf("Diagnostic: %s", d.Message)
 		}
 	}
 
 	if !foundBadInput {
 		t.Error("Expected error for OutDS in InputSignals")
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestGlobalPragmaDebug(t *testing.T) {
@@ -22,22 +22,22 @@ func TestGlobalPragmaDebug(t *testing.T) {
 		t.Fatalf("Parse failed: %v", err)
 	}
 
 	// Check if pragma parsed
 	if len(config.Pragmas) == 0 {
 		t.Fatal("Pragma not parsed")
 	}
 	t.Logf("Parsed Pragma 0: %s", config.Pragmas[0].Text)
 
 	idx := index.NewProjectTree()
 	idx.AddFile("debug.marte", config)
 	idx.ResolveReferences()
 
 	// Check if added to GlobalPragmas
 	pragmas, ok := idx.GlobalPragmas["debug.marte"]
 	if !ok || len(pragmas) == 0 {
 		t.Fatal("GlobalPragmas not populated")
 	}
 	t.Logf("Global Pragma stored: %s", pragmas[0])
 
 	v := validator.NewValidator(idx, ".")
 	v.ValidateProject()
@@ -48,11 +48,11 @@ func TestGlobalPragmaDebug(t *testing.T) {
 	for _, d := range v.Diagnostics {
 		if strings.Contains(d.Message, "Implicitly Defined Signal") {
 			foundImplicitWarning = true
 			t.Logf("Found warning: %s", d.Message)
 		}
 		if strings.Contains(d.Message, "Unused GAM") {
 			foundUnusedWarning = true
 			t.Logf("Found warning: %s", d.Message)
 		}
 	}
 
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestGlobalPragma(t *testing.T) {
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestGlobalPragmaUpdate(t *testing.T) {
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestIgnorePragma(t *testing.T) {
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestImplicitSignal(t *testing.T) {
@@ -64,10 +64,10 @@ func TestImplicitSignal(t *testing.T) {
 	}
 
 	if !foundWarning || foundError {
 		for _, d := range v.Diagnostics {
 			t.Logf("Diagnostic: %s", d.Message)
 		}
 	}
 
 	if !foundWarning {
 		t.Error("Expected warning for ImplicitSig")
@@ -83,9 +83,9 @@ func TestImplicitSignal(t *testing.T) {
 `
 	p2 := parser.NewParser(contentMissingType)
 	config2, err2 := p2.Parse()
 	if err2 != nil {
 		t.Fatalf("Parse2 failed: %v", err2)
 	}
 	idx2 := index.NewProjectTree()
 	idx2.AddFile("missing_type.marte", config2)
 	idx2.ResolveReferences()
@@ -99,9 +99,9 @@ func TestImplicitSignal(t *testing.T) {
 		}
 	}
 	if !foundTypeErr {
 		for _, d := range v2.Diagnostics {
 			t.Logf("Diagnostic2: %s", d.Message)
 		}
 		t.Error("Expected error for missing Type in implicit signal")
 	}
 }
@@ -5,9 +5,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func parseAndAddToIndex(t *testing.T, idx *index.ProjectTree, filePath string) {
@@ -32,18 +32,18 @@ func TestMultiFileNodeValidation(t *testing.T) {
 
 	// Resolving references might be needed if the validator relies on it for merging implicitly
 	// But primarily we want to check if the validator sees the merged node.
 	// The current implementation of Validator likely iterates over the ProjectTree.
 	// If the ProjectTree doesn't merge nodes automatically, the Validator needs to do it.
 	// However, the spec says "The build tool, validator, and LSP must merge these definitions".
 	// Let's assume the Validator or Index does the merging logic.
 
 	v := validator.NewValidator(idx, ".")
 	v.ValidateProject()
 
 	// +MyNode is split.
 	// valid_1 has FieldA
 	// valid_2 has Class and FieldB
 	// If merging works, it should have a Class, so no error about missing Class.
 
 	for _, diag := range v.Diagnostics {
 		if strings.Contains(diag.Message, "must contain a 'Class' field") {
@@ -80,13 +80,13 @@ func TestMultiFileReference(t *testing.T) {
 
 	idx.ResolveReferences()
 
 	// Check if the reference in +SourceNode to TargetNode is resolved.
 	v := validator.NewValidator(idx, ".")
 	v.ValidateProject()
 
 	if len(v.Diagnostics) > 0 {
 		// Filter out irrelevant errors
 	}
 }
 
 func TestHierarchicalPackageMerge(t *testing.T) {
@@ -154,43 +154,43 @@ func TestHierarchicalDuplicate(t *testing.T) {
 func TestIsolatedFileValidation(t *testing.T) {
 	idx := index.NewProjectTree()
 
 	// File 1: Has package. Defines SharedClass.
 	f1Content := `
 #package Proj.Pkg
 +SharedObj = { Class = SharedClass }
 `
 	p1 := parser.NewParser(f1Content)
 	c1, _ := p1.Parse()
 	idx.AddFile("shared.marte", c1)
 
 	// File 2: No package. References SharedObj.
 	// Should NOT resolve to SharedObj in shared.marte because iso.marte is isolated.
 	f2Content := `
 +IsoObj = {
     Class = "MyClass"
     Ref = SharedObj
 }
 `
 	p2 := parser.NewParser(f2Content)
 	c2, _ := p2.Parse()
 	idx.AddFile("iso.marte", c2)
 
 	idx.ResolveReferences()
 
 	// Find reference
 	var ref *index.Reference
 	for i := range idx.References {
 		if idx.References[i].File == "iso.marte" && idx.References[i].Name == "SharedObj" {
 			ref = &idx.References[i]
 			break
 		}
 	}
 
 	if ref == nil {
 		t.Fatal("Reference SharedObj not found in index")
 	}
 
 	if ref.Target != nil {
 		t.Errorf("Expected reference in isolated file to be unresolved, but got target in %s", ref.Target.Fragments[0].File)
 	}
 }
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 
 func TestPragmaSuppression(t *testing.T) {
@@ -6,9 +6,9 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestProjectSpecificSchema(t *testing.T) {
@@ -21,17 +21,16 @@ func TestProjectSpecificSchema(t *testing.T) {

 	// Define project schema
 	schemaContent := `
-{
-  "classes": {
-    "ProjectClass": {
-      "fields": [
-        {"name": "CustomField", "type": "int", "mandatory": true}
-      ]
-    }
-  }
-}
+package schema
+
+#Classes: {
+	ProjectClass: {
+		CustomField: int
+		...
+	}
+}
 `
-	err = os.WriteFile(filepath.Join(tmpDir, ".marte_schema.json"), []byte(schemaContent), 0644)
+	err = os.WriteFile(filepath.Join(tmpDir, ".marte_schema.cue"), []byte(schemaContent), 0644)
 	if err != nil {
 		t.Fatal(err)
 	}
@@ -59,7 +58,7 @@ func TestProjectSpecificSchema(t *testing.T) {

 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'CustomField'") {
+		if strings.Contains(d.Message, "CustomField: incomplete value") {
 			found = true
 			break
 		}
@@ -4,44 +4,11 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

-func TestSchemaValidationMandatory(t *testing.T) {
-	// StateMachine requires "States"
-	content := `
-+MySM = {
-    Class = StateMachine
-    // Missing States
-}
-`
-	p := parser.NewParser(content)
-	config, err := p.Parse()
-	if err != nil {
-		t.Fatalf("Parse failed: %v", err)
-	}
-
-	idx := index.NewProjectTree()
-	idx.AddFile("test.marte", config)
-
-	v := validator.NewValidator(idx, ".")
-	v.ValidateProject()
-
-	found := false
-	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'States'") {
-			found = true
-			break
-		}
-	}
-
-	if !found {
-		t.Error("Expected error for missing mandatory field 'States', but found none")
-	}
-}
-
 func TestSchemaValidationType(t *testing.T) {
 	// OrderedClass: First (int), Second (string)
 	content := `
@@ -65,7 +32,7 @@ func TestSchemaValidationType(t *testing.T) {

 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Field 'First' expects type 'int'") {
+		if strings.Contains(d.Message, "mismatched types") {
 			found = true
 			break
 		}
@@ -105,8 +72,8 @@ func TestSchemaValidationOrder(t *testing.T) {
 		}
 	}

-	if !found {
-		t.Error("Expected error for out-of-order fields, but found none")
+	if found {
+		t.Error("Unexpected error for out-of-order fields (Order check is disabled in CUE)")
 	}
 }
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestSignalProperties(t *testing.T) {
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestSignalValidation(t *testing.T) {
@@ -4,9 +4,9 @@ import (
 	"strings"
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestSignalsContentValidation(t *testing.T) {
@@ -3,9 +3,9 @@ package integration
 import (
 	"testing"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func TestUnusedGAM(t *testing.T) {
@@ -63,8 +63,10 @@ $App = {
 $Data = {
     +MyDS = {
         Class = DataSourceClass
-        Sig1 = { Type = uint32 }
-        Sig2 = { Type = uint32 }
+        +Signals = {
+            Sig1 = { Type = uint32 }
+            Sig2 = { Type = uint32 }
+        }
     }
 }
 }