Compare commits

...

30 Commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Martino Ferrari | 0ffcecf19e | simple makefile | 2026-01-23 14:30:17 +01:00 |
| Martino Ferrari | 761cf83b8e | Added *.out rule | 2026-01-23 14:30:02 +01:00 |
| Martino Ferrari | 7caf3a5da5 | Renamed files | 2026-01-23 14:24:43 +01:00 |
| Martino Ferrari | 94ee7e4880 | added support to enum in completion | 2026-01-23 14:18:41 +01:00 |
| Martino Ferrari | ce9b68200e | More tests | 2026-01-23 14:09:17 +01:00 |
| Martino Ferrari | e3c84fcf60 | Moved tests in test folder (and made methods public in server.go) | 2026-01-23 14:04:24 +01:00 |
| Martino Ferrari | 4a515fd6c3 | completion test | 2026-01-23 14:01:35 +01:00 |
| Martino Ferrari | 14cba1b530 | Working | 2026-01-23 14:01:26 +01:00 |
| Martino Ferrari | 462c832651 | improved suggestions | 2026-01-23 13:20:22 +01:00 |
| Martino Ferrari | 77fe3e9cac | Improved LSP reactivity | 2026-01-23 13:14:34 +01:00 |
| Martino Ferrari | 0ee44c0a27 | Readme file added | 2026-01-23 13:02:53 +01:00 |
| Martino Ferrari | d450d358b4 | add MIT Licensing | 2026-01-23 13:02:34 +01:00 |
| Martino Ferrari | 2cdcfe2812 | Updated specifications | 2026-01-23 13:02:12 +01:00 |
| Martino Ferrari | ef7729475a | Implemented auto completion | 2026-01-23 12:01:35 +01:00 |
| Martino Ferrari | 99bd5bffdd | Changed project uri | 2026-01-23 11:46:59 +01:00 |
| Martino Ferrari | 4379960835 | Removed wrong test | 2026-01-23 11:42:34 +01:00 |
| Martino Ferrari | 2aeec1e5f6 | better validation of statemachine | 2026-01-23 11:42:29 +01:00 |
| Martino Ferrari | 5853365707 | Moved to CUE validation | 2026-01-23 11:16:06 +01:00 |
| Martino Ferrari | 5c3f05a1a4 | implemented ordering preservation | 2026-01-23 10:23:02 +01:00 |
| Martino Ferrari | e2c87c90f3 | removed executable | 2026-01-23 09:44:04 +01:00 |
| Martino Ferrari | 1ea518a58a | minor improvment in the hover doc | 2026-01-22 13:38:47 +01:00 |
| Martino Ferrari | 0654062d08 | Almost done | 2026-01-22 03:55:00 +01:00 |
| Martino Ferrari | a88f833f49 | Improving parsing and specs | 2026-01-22 03:15:42 +01:00 |
| Martino Ferrari | b2e963fc04 | Implementing pragmas | 2026-01-22 02:51:36 +01:00 |
| Martino Ferrari | 8fe319de2d | Pragma and signal validation added | 2026-01-22 02:29:54 +01:00 |
| Martino Ferrari | 93d48bd3ed | mostly good | 2026-01-22 02:19:14 +01:00 |
| Martino Ferrari | 164dad896c | better indexing | 2026-01-22 01:53:50 +01:00 |
| Martino Ferrari | f111bf1aaa | better indexing | 2026-01-22 01:53:45 +01:00 |
| Martino Ferrari | 4a624aa929 | better indexing | 2026-01-22 01:26:24 +01:00 |
| Martino Ferrari | 5b0834137b | not bad | 2026-01-22 01:26:17 +01:00 |
53 changed files with 10334 additions and 1092 deletions

.gitignore (vendored, +2)

```diff
@@ -1,2 +1,4 @@
 build
 *.log
+mdt
+*.out
```

LICENSE (new file, 21 lines)
MIT License
Copyright (c) 2026 MARTe Community
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

Makefile (new file, 24 lines)

```makefile
BINARY_NAME=mdt
BUILD_DIR=build

.PHONY: all build test coverage clean install

all: test build

build:
	mkdir -p $(BUILD_DIR)
	go build -o $(BUILD_DIR)/$(BINARY_NAME) ./cmd/mdt

test:
	go test -v ./...

coverage:
	go test -cover -coverprofile=coverage.out ./test/... -coverpkg=./internal/...
	go tool cover -func=coverage.out

clean:
	rm -rf $(BUILD_DIR)
	rm -f coverage.out

install:
	go install ./cmd/mdt
```

README.md (new file, 96 lines)
# MARTe Development Tools (mdt)
`mdt` is a comprehensive toolkit for developing, validating, and building configurations for the MARTe real-time framework. It provides a CLI and a Language Server Protocol (LSP) server to enhance the development experience.
## Features
- **LSP Server**: Real-time syntax checking, validation, autocomplete, hover documentation, and navigation (Go to Definition/References).
- **Builder**: Merges multiple configuration files into a single, ordered output file.
- **Formatter**: Standardizes configuration file formatting.
- **Validator**: Advanced semantic validation using [CUE](https://cuelang.org/) schemas, ensuring type safety and structural correctness.
## Installation
### From Source
Requirements: Go 1.21+
```bash
go install github.com/marte-community/marte-dev-tools/cmd/mdt@latest
```
## Usage
### CLI Commands
- **Check**: Run validation on a file or project.
```bash
mdt check path/to/project
```
- **Build**: Merge project files into a single output.
```bash
mdt build -o output.marte main.marte
```
- **Format**: Format configuration files.
```bash
mdt fmt path/to/file.marte
```
- **LSP**: Start the language server (used by editor plugins).
```bash
mdt lsp
```
### Editor Integration
`mdt lsp` implements the Language Server Protocol. You can use it with any LSP-compatible editor (VS Code, Neovim, Emacs, etc.).
## MARTe Configuration
The tools support the MARTe configuration format with extended features:
- **Objects**: `+Node = { Class = ... }`
- **Signals**: `Signal = { Type = ... }`
- **Namespaces**: `#package PROJECT.NODE` for organizing multi-file projects.
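A minimal sketch of how these constructs combine in a single file (the names `PROJECT.APP`, `Timings`, `Counter`, and `DDB1` are illustrative, not taken from the real schema):

```
#package PROJECT.APP

// A GAM with one input signal
+Timings = {
    Class = IOGAM
    InputSignals = {
        Counter = {
            DataSource = DDB1
            Type = uint32
        }
    }
}
```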
### Validation & Schema
Validation is fully schema-driven using CUE.
- **Built-in Schema**: Covers standard MARTe classes (`StateMachine`, `GAM`, `DataSource`, `RealTimeApplication`, etc.).
- **Custom Schema**: Add a `.marte_schema.cue` file to your project root to extend or override definitions.
**Example `.marte_schema.cue`:**
```cue
package schema
#Classes: {
	MyCustomGAM: {
		Param1: int
		Param2?: string
		...
	}
}
```
### Pragmas (Suppressing Warnings)
Use comments starting with `//!` to control validation behavior:
- `//!unused: Reason` - Suppress "Unused GAM" or "Unused Signal" warnings.
- `//!implicit: Reason` - Suppress "Implicitly Defined Signal" warnings.
- `//!cast(DefinedType, UsageType)` - Allow type mismatch between definition and usage (e.g. `//!cast(uint32, int32)`).
- `//!allow(unused)` - Global suppression for the file.
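As a sketch, the targeted pragmas are placed on the lines immediately above the definition they apply to; this hypothetical signal definition suppresses both an unused-signal and a cast warning (the names `Counter` and `Timer` are illustrative):

```
//!unused: Consumed by an external GAM
//!cast(uint64, uint32)
Counter = {
    DataSource = Timer
    Type = uint64
}
```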
## Development
### Building
```bash
go build ./cmd/mdt
```
### Running Tests
```bash
go test ./...
```
## License
MIT

(file name not shown)

```diff
@@ -4,13 +4,13 @@ import (
 	"bytes"
 	"os"

-	"github.com/marte-dev/marte-dev-tools/internal/builder"
-	"github.com/marte-dev/marte-dev-tools/internal/formatter"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/lsp"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/builder"
+	"github.com/marte-community/marte-dev-tools/internal/formatter"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/lsp"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )

 func main() {
```

(file name not shown; new file, 27 lines)

```
//!allow(unused): Ignore unused GAMs in this file
//!allow(implicit): Ignore implicit signals in this file

+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {}
    }
}

+MyGAM = {
    Class = IOGAM
    InputSignals = {
        // Implicit signal (not in MyDS)
        ImplicitSig = {
            DataSource = MyDS
            Type = uint32
        }
    }
}

// Unused GAM
+UnusedGAM = {
    Class = IOGAM
}
```

examples/test_app.marte (new file, 6417 lines)

File diff suppressed because it is too large.

go.mod (17 changed lines)

```diff
@@ -1,3 +1,18 @@
-module github.com/marte-dev/marte-dev-tools
+module github.com/marte-community/marte-dev-tools

 go 1.25.6
+
+require cuelang.org/go v0.15.3
+
+require (
+	github.com/cockroachdb/apd/v3 v3.2.1 // indirect
+	github.com/emicklei/proto v1.14.2 // indirect
+	github.com/google/uuid v1.6.0 // indirect
+	github.com/mitchellh/go-wordwrap v1.0.1 // indirect
+	github.com/pelletier/go-toml/v2 v2.2.4 // indirect
+	github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91 // indirect
+	go.yaml.in/yaml/v3 v3.0.4 // indirect
+	golang.org/x/net v0.46.0 // indirect
+	golang.org/x/text v0.30.0 // indirect
+	google.golang.org/protobuf v1.33.0 // indirect
+)
```

go.sum (new file, 53 lines)
cuelabs.dev/go/oci/ociregistry v0.0.0-20250722084951-074d06050084 h1:4k1yAtPvZJZQTu8DRY8muBo0LHv6TqtrE0AO5n6IPYs=
cuelabs.dev/go/oci/ociregistry v0.0.0-20250722084951-074d06050084/go.mod h1:4WWeZNxUO1vRoZWAHIG0KZOd6dA25ypyWuwD3ti0Tdc=
cuelang.org/go v0.15.3 h1:JKR/lZVwuIGlLTGIaJ0jONz9+CK3UDx06sQ6DDxNkaE=
cuelang.org/go v0.15.3/go.mod h1:NYw6n4akZcTjA7QQwJ1/gqWrrhsN4aZwhcAL0jv9rZE=
github.com/cockroachdb/apd/v3 v3.2.1 h1:U+8j7t0axsIgvQUqthuNm82HIrYXodOV2iWLWtEaIwg=
github.com/cockroachdb/apd/v3 v3.2.1/go.mod h1:klXJcjp+FffLTHlhIG69tezTDvdP065naDsHzKhYSqc=
github.com/emicklei/proto v1.14.2 h1:wJPxPy2Xifja9cEMrcA/g08art5+7CGJNFNk35iXC1I=
github.com/emicklei/proto v1.14.2/go.mod h1:rn1FgRS/FANiZdD2djyH7TMA9jdRDcYQ9IEN9yvjX0A=
github.com/go-quicktest/qt v1.101.0 h1:O1K29Txy5P2OK0dGo59b7b0LR6wKfIhttaAhHUyn7eI=
github.com/go-quicktest/qt v1.101.0/go.mod h1:14Bz/f7NwaXPtdYEgzsx46kqSxVwTbzVZsDC26tQJow=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=
github.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=
github.com/lib/pq v1.10.7 h1:p7ZhMD+KsSRozJr34udlUrhboJwWAgCg34+/ZZNvZZw=
github.com/lib/pq v1.10.7/go.mod h1:AlVN5x4E4T544tWzH6hKfbfQvm3HdbOxrmggDNAPY9o=
github.com/mitchellh/go-wordwrap v1.0.1 h1:TLuKupo69TCn6TQSyGxwI1EblZZEsQ0vMlAFQflz0v0=
github.com/mitchellh/go-wordwrap v1.0.1/go.mod h1:R62XHJLzvMFRBbcrT7m7WgmE1eOyTSsCt+hzestvNj0=
github.com/opencontainers/go-digest v1.0.0 h1:apOUWs51W5PlhuyGyz9FCeeBIOUDA/6nW8Oi/yOhh5U=
github.com/opencontainers/go-digest v1.0.0/go.mod h1:0JzlMkj0TRzQZfJkVvzbP0HBR3IKzErnv2BNG4W4MAM=
github.com/opencontainers/image-spec v1.1.1 h1:y0fUlFfIZhPF1W537XOLg0/fcx6zcHCJwooC2xJA040=
github.com/opencontainers/image-spec v1.1.1/go.mod h1:qpqAh3Dmcf36wStyyWU+kCeDgrGnAve2nCC8+7h8Q0M=
github.com/pelletier/go-toml/v2 v2.2.4 h1:mye9XuhQ6gvn5h28+VilKrrPoQVanw5PMw/TB0t5Ec4=
github.com/pelletier/go-toml/v2 v2.2.4/go.mod h1:2gIqNv+qfxSVS7cM2xJQKtLSTLUE9V8t9Stt+h56mCY=
github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91 h1:s1LvMaU6mVwoFtbxv/rCZKE7/fwDmDY684FfUe4c1Io=
github.com/protocolbuffers/txtpbfmt v0.0.0-20251016062345-16587c79cd91/go.mod h1:JSbkp0BviKovYYt9XunS95M3mLPibE9bGg+Y95DsEEY=
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
go.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/mod v0.29.0 h1:HV8lRxZC4l2cr3Zq1LvtOsi/ThTgWnUk/y64QSs8GwA=
golang.org/x/mod v0.29.0/go.mod h1:NyhrlYXJ2H4eJiRy/WDBO6HMqZQ6q9nk4JzS3NuCK+w=
golang.org/x/net v0.46.0 h1:giFlY12I07fugqwPuWJi68oOnpfqFnJIJzaIIm2JVV4=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/oauth2 v0.32.0 h1:jsCblLleRMDrxMN29H3z/k1KliIvpLgCkE6R8FXXNgY=
golang.org/x/oauth2 v0.32.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
golang.org/x/sync v0.17.0 h1:l60nONMj9l5drqw6jlhIELNv9I0A4OFgRsG9k2oT9Ug=
golang.org/x/sync v0.17.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/text v0.30.0 h1:yznKA/E9zq54KzlzBEAWn1NXSQ8DIp/NYMy88xJjl4k=
golang.org/x/text v0.30.0/go.mod h1:yDdHFIX9t+tORqspjENWgzaCVXgk0yYnYuSZ8UzzBVM=
golang.org/x/tools v0.38.0 h1:Hx2Xv8hISq8Lm16jvBZ2VQf+RLmbd7wVUsALibYI/IQ=
golang.org/x/tools v0.38.0/go.mod h1:yEsQ/d/YK8cjh0L6rZlY8tgtlKiBNTL14pGDJPJpYQs=
google.golang.org/protobuf v1.33.0 h1:uNO2rsAINq/JlFpSdYEKIZ0uKD/R9cpdv0T+yoGwGmI=
google.golang.org/protobuf v1.33.0/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 h1:qIbj1fsPNlZgppZ+VLlY7N33q108Sa+fhmuc+sWQYwY=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=

(file name not shown)

```diff
@@ -6,8 +6,8 @@ import (
 	"sort"
 	"strings"

-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )

 type Builder struct {
@@ -71,87 +71,39 @@ func (b *Builder) writeNodeContent(f *os.File, node *index.ProjectNode, indent i
 	indentStr := strings.Repeat(" ", indent)
 	// If this node has a RealName (e.g. +App), we print it as an object definition
-	// UNLESS it is the top-level output file itself?
-	// If we are writing "App.marte", maybe we are writing the *body* of App?
-	// Spec: "unifying multi-file project into a single configuration output"
-	// Let's assume we print the Node itself.
 	if node.RealName != "" {
 		fmt.Fprintf(f, "%s%s = {\n", indentStr, node.RealName)
 		indent++
 		indentStr = strings.Repeat(" ", indent)
 	}
+	writtenChildren := make(map[string]bool)
 	// 2. Write definitions from fragments
 	for _, frag := range node.Fragments {
-		// Use formatter logic to print definitions
-		// We need a temporary Config to use Formatter?
-		// Or just reimplement basic printing? Formatter is better.
-		// But Formatter prints to io.Writer.
-		// We can reuse formatDefinition logic if we exposed it, or just copy basic logic.
-		// Since we need to respect indentation, using Formatter.Format might be tricky
-		// unless we wrap definitions in a dummy structure.
 		for _, def := range frag.Definitions {
-			// Basic formatting for now, referencing formatter style
-			b.writeDefinition(f, def, indent)
+			switch d := def.(type) {
+			case *parser.Field:
+				b.writeDefinition(f, d, indent)
+			case *parser.ObjectNode:
+				norm := index.NormalizeName(d.Name)
+				if child, ok := node.Children[norm]; ok {
+					if !writtenChildren[norm] {
+						b.writeNodeContent(f, child, indent)
+						writtenChildren[norm] = true
+					}
+				}
+			}
 		}
 	}
 	// 3. Write Children (recursively)
-	// Children are sub-nodes defined implicitly via #package A.B or explicitly +Sub
-	// Explicit +Sub are handled via Fragments logic (they are definitions in fragments).
-	// Implicit nodes (from #package A.B.C where B was never explicitly defined)
-	// show up in Children map but maybe not in Fragments?
-	// If a Child is NOT in fragments (implicit), we still need to write it.
-	// If it IS in fragments (explicit +Child), it was handled in loop above?
-	// Wait. My Indexer puts `+Sub` into `node.Children["Sub"]` AND adds a `Fragment` to `node` containing `+Sub` object?
-	// Let's check Indexer.
-	// Case ObjectNode:
-	// Adds Fragment to `child` (the Sub node).
-	// Does NOT add `ObjectNode` definition to `node`'s fragment list?
-	// "pt.addObjectFragment(child...)"
-	// It does NOT add to `fileFragment.Definitions`.
-	// So `node.Fragments` only contains Fields!
-	// Children are all in `node.Children`.
-	// So:
-	// 1. Write Fields (from Fragments).
-	// 2. Write Children (from Children map).
-	// But wait, Fragments might have order?
-	// "Relative ordering within a file is preserved."
-	// My Indexer splits Fields and Objects.
-	// Fields go to Fragments. Objects go to Children.
-	// This loses the relative order between Fields and Objects in the source file!
-	// Correct Indexer approach for preserving order:
-	// `Fragment` should contain a list of `Entry`.
-	// `Entry` can be `Field` OR `ChildNodeName`.
-	// But I just rewrote Indexer to split them.
-	// If strict order is required "within a file", my Indexer is slightly lossy regarding Field vs Object order.
-	// Spec: "Relative ordering within a file is preserved."
-	// To fix this without another full rewrite:
-	// Iterating `node.Children` alphabetically is arbitrary.
-	// We should ideally iterate them in the order they appear.
-	// For now, I will proceed with writing Children after Fields, which is a common convention,
-	// unless strict interleaving is required.
-	// Given "Class first" rule, reordering happens anyway.
-	// Sorting Children?
-	// Maybe keep a list of OrderedChildren in ProjectNode?
 	sortedChildren := make([]string, 0, len(node.Children))
 	for k := range node.Children {
+		if !writtenChildren[k] {
 			sortedChildren = append(sortedChildren, k)
+		}
 	}
 	sort.Strings(sortedChildren) // Alphabetical for determinism
 	for _, k := range sortedChildren {
```

(file name not shown)

```diff
@@ -6,7 +6,7 @@ import (
 	"sort"
 	"strings"

-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )

 type Insertable struct {
```

(file name not shown)

```diff
@@ -5,14 +5,15 @@ import (
 	"path/filepath"
 	"strings"

-	"github.com/marte-dev/marte-dev-tools/internal/logger"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/logger"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
 )

 type ProjectTree struct {
 	Root          *ProjectNode
 	References    []Reference
 	IsolatedFiles map[string]*ProjectNode
+	GlobalPragmas map[string][]string
 }

 func (pt *ProjectTree) ScanDirectory(rootPath string) error {
@@ -50,6 +51,8 @@ type ProjectNode struct {
 	Children map[string]*ProjectNode
 	Parent   *ProjectNode
 	Metadata map[string]string // Store extra info like Class, Type, Size
+	Target   *ProjectNode // Points to referenced node (for Direct References/Links)
+	Pragmas  []string
 }

 type Fragment struct {
@@ -57,6 +60,7 @@ type Fragment struct {
 	Definitions []parser.Definition
 	IsObject    bool
 	ObjectPos   parser.Position
+	EndPos      parser.Position
 	Doc         string // Documentation for this fragment (if object)
 }
@@ -67,6 +71,7 @@ func NewProjectTree() *ProjectTree {
 			Metadata: make(map[string]string),
 		},
 		IsolatedFiles: make(map[string]*ProjectNode),
+		GlobalPragmas: make(map[string][]string),
 	}
 }
@@ -87,6 +92,7 @@ func (pt *ProjectTree) RemoveFile(file string) {
 	pt.References = newRefs
 	delete(pt.IsolatedFiles, file)
+	delete(pt.GlobalPragmas, file)
 	pt.removeFileFromNode(pt.Root, file)
 }
@@ -154,6 +160,15 @@ func (pt *ProjectTree) extractFieldMetadata(node *ProjectNode, f *parser.Field)
 func (pt *ProjectTree) AddFile(file string, config *parser.Configuration) {
 	pt.RemoveFile(file)
+	// Collect global pragmas
+	for _, p := range config.Pragmas {
+		txt := strings.TrimSpace(strings.TrimPrefix(p.Text, "//!"))
+		normalized := strings.ReplaceAll(txt, " ", "")
+		if strings.HasPrefix(normalized, "allow(") || strings.HasPrefix(normalized, "ignore(") {
+			pt.GlobalPragmas[file] = append(pt.GlobalPragmas[file], txt)
+		}
+	}
 	if config.Package == nil {
 		node := &ProjectNode{
 			Children: make(map[string]*ProjectNode),
@@ -200,12 +215,14 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
 	for _, def := range config.Definitions {
 		doc := pt.findDoc(config.Comments, def.Pos())
+		pragmas := pt.findPragmas(config.Pragmas, def.Pos())
 		switch d := def.(type) {
 		case *parser.Field:
 			fileFragment.Definitions = append(fileFragment.Definitions, d)
 			pt.indexValue(file, d.Value)
 		case *parser.ObjectNode:
+			fileFragment.Definitions = append(fileFragment.Definitions, d)
 			norm := NormalizeName(d.Name)
 			if _, ok := node.Children[norm]; !ok {
 				node.Children[norm] = &ProjectNode{
@@ -228,7 +245,11 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
 				child.Doc += doc
 			}
-			pt.addObjectFragment(child, file, d, doc, config.Comments)
+			if len(pragmas) > 0 {
+				child.Pragmas = append(child.Pragmas, pragmas...)
+			}
+			pt.addObjectFragment(child, file, d, doc, config.Comments, config.Pragmas)
 		}
 	}
@@ -237,16 +258,18 @@
 	}
 }

-func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *parser.ObjectNode, doc string, comments []parser.Comment) {
+func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *parser.ObjectNode, doc string, comments []parser.Comment, pragmas []parser.Pragma) {
 	frag := &Fragment{
 		File:      file,
 		IsObject:  true,
 		ObjectPos: obj.Position,
+		EndPos:    obj.Subnode.EndPosition,
 		Doc:       doc,
 	}
 	for _, def := range obj.Subnode.Definitions {
 		subDoc := pt.findDoc(comments, def.Pos())
+		subPragmas := pt.findPragmas(pragmas, def.Pos())
 		switch d := def.(type) {
 		case *parser.Field:
@@ -254,6 +277,7 @@
 			pt.indexValue(file, d.Value)
 			pt.extractFieldMetadata(node, d)
 		case *parser.ObjectNode:
+			frag.Definitions = append(frag.Definitions, d)
 			norm := NormalizeName(d.Name)
 			if _, ok := node.Children[norm]; !ok {
 				node.Children[norm] = &ProjectNode{
@@ -276,7 +300,11 @@
 				child.Doc += subDoc
 			}
-			pt.addObjectFragment(child, file, d, subDoc, comments)
+			if len(subPragmas) > 0 {
+				child.Pragmas = append(child.Pragmas, subPragmas...)
+			}
+			pt.addObjectFragment(child, file, d, subDoc, comments, pragmas)
 		}
 	}
@@ -321,6 +349,30 @@ func (pt *ProjectTree) findDoc(comments []parser.Comment, pos parser.Position) s
 	return docBuilder.String()
 }

+func (pt *ProjectTree) findPragmas(pragmas []parser.Pragma, pos parser.Position) []string {
+	var found []string
+	targetLine := pos.Line - 1
+	for i := len(pragmas) - 1; i >= 0; i-- {
+		p := pragmas[i]
+		if p.Position.Line > pos.Line {
+			continue
+		}
+		if p.Position.Line == pos.Line {
+			continue
+		}
+		if p.Position.Line == targetLine {
+			txt := strings.TrimSpace(strings.TrimPrefix(p.Text, "//!"))
+			found = append(found, txt)
+			targetLine--
+		} else if p.Position.Line < targetLine {
+			break
+		}
+	}
+	return found
+}
+
 func (pt *ProjectTree) indexValue(file string, val parser.Value) {
 	switch v := val.(type) {
 	case *parser.ReferenceValue:
@@ -340,25 +392,65 @@ func (pt *ProjectTree) ResolveReferences() {
 	for i := range pt.References {
 		ref := &pt.References[i]
 		if isoNode, ok := pt.IsolatedFiles[ref.File]; ok {
-			ref.Target = pt.findNode(isoNode, ref.Name)
+			ref.Target = pt.FindNode(isoNode, ref.Name, nil)
 		} else {
-			ref.Target = pt.findNode(pt.Root, ref.Name)
+			ref.Target = pt.FindNode(pt.Root, ref.Name, nil)
 		}
 	}
 }

-func (pt *ProjectTree) findNode(root *ProjectNode, name string) *ProjectNode {
+func (pt *ProjectTree) FindNode(root *ProjectNode, name string, predicate func(*ProjectNode) bool) *ProjectNode {
+	if strings.Contains(name, ".") {
+		parts := strings.Split(name, ".")
+		rootName := parts[0]
+		var candidates []*ProjectNode
+		pt.findAllNodes(root, rootName, &candidates)
+		for _, cand := range candidates {
+			curr := cand
+			valid := true
+			for i := 1; i < len(parts); i++ {
+				nextName := parts[i]
+				normNext := NormalizeName(nextName)
+				if child, ok := curr.Children[normNext]; ok {
+					curr = child
+				} else {
+					valid = false
+					break
+				}
+			}
+			if valid {
+				if predicate == nil || predicate(curr) {
+					return curr
+				}
+			}
+		}
+		return nil
+	}
 	if root.RealName == name || root.Name == name {
+		if predicate == nil || predicate(root) {
 			return root
 		}
+	}
 	for _, child := range root.Children {
-		if res := pt.findNode(child, name); res != nil {
+		if res := pt.FindNode(child, name, predicate); res != nil {
 			return res
 		}
 	}
 	return nil
 }

+func (pt *ProjectTree) findAllNodes(root *ProjectNode, name string, results *[]*ProjectNode) {
+	if root.RealName == name || root.Name == name {
+		*results = append(*results, root)
+	}
+	for _, child := range root.Children {
+		pt.findAllNodes(child, name, results)
+	}
+}
+
 type QueryResult struct {
 	Node  *ProjectNode
 	Field *parser.Field
@@ -384,6 +476,22 @@ func (pt *ProjectTree) Query(file string, line, col int) *QueryResult {
 	return pt.queryNode(pt.Root, file, line, col)
 }

+func (pt *ProjectTree) Walk(visitor func(*ProjectNode)) {
+	if pt.Root != nil {
+		pt.walkRecursive(pt.Root, visitor)
+	}
+	for _, node := range pt.IsolatedFiles {
+		pt.walkRecursive(node, visitor)
+	}
+}
+
+func (pt *ProjectTree) walkRecursive(node *ProjectNode, visitor func(*ProjectNode)) {
+	visitor(node)
+	for _, child := range node.Children {
+		pt.walkRecursive(child, visitor)
+	}
+}
+
 func (pt *ProjectTree) queryNode(node *ProjectNode, file string, line, col int) *QueryResult {
 	for _, frag := range node.Fragments {
 		if frag.File == file {
@@ -410,3 +518,44 @@
 	}
 	return nil
 }

+func (pt *ProjectTree) GetNodeContaining(file string, pos parser.Position) *ProjectNode {
+	if isoNode, ok := pt.IsolatedFiles[file]; ok {
+		if found := pt.findNodeContaining(isoNode, file, pos); found != nil {
+			return found
+		}
+		return isoNode
+	}
+	if pt.Root != nil {
+		if found := pt.findNodeContaining(pt.Root, file, pos); found != nil {
+			return found
+		}
+		for _, frag := range pt.Root.Fragments {
+			if frag.File == file && !frag.IsObject {
+				return pt.Root
+			}
+		}
+	}
+	return nil
+}
+
+func (pt *ProjectTree) findNodeContaining(node *ProjectNode, file string, pos parser.Position) *ProjectNode {
+	for _, child := range node.Children {
+		if res := pt.findNodeContaining(child, file, pos); res != nil {
+			return res
+		}
+	}
+	for _, frag := range node.Fragments {
+		if frag.File == file && frag.IsObject {
+			start := frag.ObjectPos
+			end := frag.EndPos
+			if (pos.Line > start.Line || (pos.Line == start.Line && pos.Column >= start.Column)) &&
+				(pos.Line < end.Line || (pos.Line == end.Line && pos.Column <= end.Column)) {
+				return node
+			}
+		}
+	}
+	return nil
+}
```
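The `findPragmas` helper in the hunk above attaches a contiguous run of `//!` pragma comment lines to the definition that starts on the line immediately below them. A standalone, simplified sketch of that line-matching rule (plain `int` line numbers instead of `parser.Position`, and a hypothetical `pragma` struct, so it runs without the rest of the package):

```go
package main

import (
	"fmt"
	"strings"
)

// pragma mirrors the shape of parser.Pragma used above: a line number and raw text.
type pragma struct {
	line int
	text string
}

// findPragmas collects the pragmas sitting on the lines immediately above
// defLine, walking upward until the contiguous run breaks.
func findPragmas(pragmas []pragma, defLine int) []string {
	var found []string
	target := defLine - 1
	for i := len(pragmas) - 1; i >= 0; i-- {
		p := pragmas[i]
		if p.line >= defLine {
			continue // pragma at or below the definition: not attached to it
		}
		if p.line == target {
			// Strip the "//!" marker, keep the payload.
			found = append(found, strings.TrimSpace(strings.TrimPrefix(p.text, "//!")))
			target--
		} else if p.line < target {
			break // gap above the definition: the run has ended
		}
	}
	return found
}

func main() {
	ps := []pragma{
		{line: 1, text: "//!allow(unused)"}, // far above: separated by a gap, ignored
		{line: 4, text: "//!unused: demo"},
		{line: 5, text: "//!cast(uint32, int32)"},
	}
	// Definition on line 6: lines 5 and 4 form a contiguous run above it.
	fmt.Println(findPragmas(ps, 6)) // prints: [cast(uint32, int32) unused: demo]
}
```

Note the pragmas come out nearest-first, since the scan walks upward from the definition.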

View File

@@ -7,15 +7,51 @@ import (
"fmt" "fmt"
"io" "io"
"os" "os"
"regexp"
"strings" "strings"
"github.com/marte-dev/marte-dev-tools/internal/formatter" "github.com/marte-community/marte-dev-tools/internal/formatter"
"github.com/marte-dev/marte-dev-tools/internal/index" "github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-dev/marte-dev-tools/internal/logger" "github.com/marte-community/marte-dev-tools/internal/logger"
"github.com/marte-dev/marte-dev-tools/internal/parser" "github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-dev/marte-dev-tools/internal/validator" "github.com/marte-community/marte-dev-tools/internal/schema"
"github.com/marte-community/marte-dev-tools/internal/validator"
"cuelang.org/go/cue"
) )
type CompletionParams struct {
TextDocument TextDocumentIdentifier `json:"textDocument"`
Position Position `json:"position"`
Context CompletionContext `json:"context,omitempty"`
}
type CompletionContext struct {
TriggerKind int `json:"triggerKind"`
}
type CompletionItem struct {
Label string `json:"label"`
Kind int `json:"kind"`
Detail string `json:"detail,omitempty"`
Documentation string `json:"documentation,omitempty"`
InsertText string `json:"insertText,omitempty"`
InsertTextFormat int `json:"insertTextFormat,omitempty"` // 1: PlainText, 2: Snippet
SortText string `json:"sortText,omitempty"`
}
type CompletionList struct {
IsIncomplete bool `json:"isIncomplete"`
Items []CompletionItem `json:"items"`
}
var Tree = index.NewProjectTree()
var Documents = make(map[string]string)
var ProjectRoot string
var GlobalSchema *schema.Schema
type JsonRpcMessage struct { type JsonRpcMessage struct {
Jsonrpc string `json:"jsonrpc"` Jsonrpc string `json:"jsonrpc"`
Method string `json:"method,omitempty"` Method string `json:"method,omitempty"`
@@ -135,9 +171,6 @@ type TextEdit struct {
NewText string `json:"newText"` NewText string `json:"newText"`
} }
var tree = index.NewProjectTree()
var documents = make(map[string]string)
var projectRoot string
func RunServer() { func RunServer() {
reader := bufio.NewReader(os.Stdin) reader := bufio.NewReader(os.Stdin)
@@ -151,7 +184,7 @@ func RunServer() {
continue continue
} }
handleMessage(msg) HandleMessage(msg)
} }
} }
@@ -181,7 +214,7 @@ func readMessage(reader *bufio.Reader) (*JsonRpcMessage, error) {
return &msg, err return &msg, err
} }
func handleMessage(msg *JsonRpcMessage) { func HandleMessage(msg *JsonRpcMessage) {
switch msg.Method { switch msg.Method {
case "initialize": case "initialize":
var params InitializeParams var params InitializeParams
@@ -194,10 +227,13 @@ func handleMessage(msg *JsonRpcMessage) {
}
if root != "" {
ProjectRoot = root
logger.Printf("Scanning workspace: %s\n", root)
if err := Tree.ScanDirectory(root); err != nil {
logger.Printf("ScanDirectory failed: %v\n", err)
}
Tree.ResolveReferences()
GlobalSchema = schema.LoadFullSchema(ProjectRoot)
}
}
@@ -208,6 +244,9 @@ func handleMessage(msg *JsonRpcMessage) {
"definitionProvider": true,
"referencesProvider": true,
"documentFormattingProvider": true,
"completionProvider": map[string]any{
"triggerCharacters": []string{"=", " "},
},
},
})
case "initialized":
@@ -219,18 +258,18 @@ func handleMessage(msg *JsonRpcMessage) {
case "textDocument/didOpen":
var params DidOpenTextDocumentParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
HandleDidOpen(params)
}
case "textDocument/didChange":
var params DidChangeTextDocumentParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
HandleDidChange(params)
}
case "textDocument/hover":
var params HoverParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
logger.Printf("Hover: %s:%d", params.TextDocument.URI, params.Position.Line)
res := HandleHover(params)
if res != nil {
logger.Printf("Res: %v", res.Contents)
} else {
@@ -244,17 +283,22 @@ func handleMessage(msg *JsonRpcMessage) {
case "textDocument/definition":
var params DefinitionParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
respond(msg.ID, HandleDefinition(params))
}
case "textDocument/references":
var params ReferenceParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
respond(msg.ID, HandleReferences(params))
}
case "textDocument/completion":
var params CompletionParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
respond(msg.ID, HandleCompletion(params))
}
case "textDocument/formatting":
var params DocumentFormattingParams
if err := json.Unmarshal(msg.Params, &params); err == nil {
respond(msg.ID, HandleFormatting(params))
}
}
}
@@ -263,37 +307,51 @@ func uriToPath(uri string) string {
return strings.TrimPrefix(uri, "file://")
}
func HandleDidOpen(params DidOpenTextDocumentParams) {
path := uriToPath(params.TextDocument.URI)
Documents[params.TextDocument.URI] = params.TextDocument.Text
p := parser.NewParser(params.TextDocument.Text)
config, err := p.Parse()
if err != nil {
publishParserError(params.TextDocument.URI, err)
} else {
publishParserError(params.TextDocument.URI, nil)
}
if config != nil {
Tree.AddFile(path, config)
Tree.ResolveReferences()
runValidation(params.TextDocument.URI)
}
}
func HandleDidChange(params DidChangeTextDocumentParams) {
if len(params.ContentChanges) == 0 {
return
}
text := params.ContentChanges[0].Text
Documents[params.TextDocument.URI] = text
path := uriToPath(params.TextDocument.URI)
p := parser.NewParser(text)
config, err := p.Parse()
if err != nil {
publishParserError(params.TextDocument.URI, err)
} else {
publishParserError(params.TextDocument.URI, nil)
}
if config != nil {
Tree.AddFile(path, config)
Tree.ResolveReferences()
runValidation(params.TextDocument.URI)
}
}
func HandleFormatting(params DocumentFormattingParams) []TextEdit {
uri := params.TextDocument.URI
text, ok := Documents[uri]
if !ok {
return nil
}
@@ -325,7 +383,7 @@ func handleFormatting(params DocumentFormattingParams) []TextEdit {
}
func runValidation(uri string) {
v := validator.NewValidator(Tree, ProjectRoot)
v.ValidateProject()
v.CheckUnused()
@@ -334,7 +392,7 @@ func runValidation(uri string) {
// Collect all known files to ensure we clear diagnostics for fixed files
knownFiles := make(map[string]bool)
collectFiles(Tree.Root, knownFiles)
// Initialize all known files with empty diagnostics
for f := range knownFiles {
@@ -378,6 +436,57 @@ func runValidation(uri string) {
}
}
func publishParserError(uri string, err error) {
if err == nil {
notification := JsonRpcMessage{
Jsonrpc: "2.0",
Method: "textDocument/publishDiagnostics",
Params: mustMarshal(PublishDiagnosticsParams{
URI: uri,
Diagnostics: []LSPDiagnostic{},
}),
}
send(notification)
return
}
var line, col int
var msg string
// Try parsing "line:col: message"
n, _ := fmt.Sscanf(err.Error(), "%d:%d: ", &line, &col)
if n == 2 {
parts := strings.SplitN(err.Error(), ": ", 2)
if len(parts) == 2 {
msg = parts[1]
}
} else {
// Fallback
line = 1
col = 1
msg = err.Error()
}
diag := LSPDiagnostic{
Range: Range{
Start: Position{Line: line - 1, Character: col - 1},
End: Position{Line: line - 1, Character: col},
},
Severity: 1, // Error
Message: msg,
Source: "mdt-parser",
}
notification := JsonRpcMessage{
Jsonrpc: "2.0",
Method: "textDocument/publishDiagnostics",
Params: mustMarshal(PublishDiagnosticsParams{
URI: uri,
Diagnostics: []LSPDiagnostic{diag},
}),
}
send(notification)
}
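publishParserError above assumes parser errors carry a `line:col: message` prefix, falling back to position 1:1 otherwise. A minimal standalone sketch of that extraction (`parseErrorPos` is a hypothetical helper name, not part of the server):

```go
package main

import (
	"fmt"
	"strings"
)

// parseErrorPos pulls line/column out of a "line:col: message" error
// string, falling back to 1:1 when no position prefix is present.
// (Sketch only; mirrors the Sscanf/SplitN logic in publishParserError.)
func parseErrorPos(errStr string) (line, col int, msg string) {
	n, _ := fmt.Sscanf(errStr, "%d:%d: ", &line, &col)
	if n == 2 {
		// Strip the "line:col" prefix; keep only the message part.
		if parts := strings.SplitN(errStr, ": ", 2); len(parts) == 2 {
			return line, col, parts[1]
		}
	}
	return 1, 1, errStr
}

func main() {
	fmt.Println(parseErrorPos("12:5: unexpected token }"))
}
```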
func collectFiles(node *index.ProjectNode, files map[string]bool) {
for _, frag := range node.Fragments {
files[frag.File] = true
@@ -392,12 +501,12 @@ func mustMarshal(v any) json.RawMessage {
return b
}
func HandleHover(params HoverParams) *Hover {
path := uriToPath(params.TextDocument.URI)
line := params.Position.Line + 1
col := params.Position.Character + 1
res := Tree.Query(path, line, col)
if res == nil {
logger.Printf("No object/node/reference found")
return nil
@@ -406,7 +515,11 @@ func handleHover(params HoverParams) *Hover {
var content string
if res.Node != nil {
if res.Node.Target != nil {
content = fmt.Sprintf("**Link**: `%s` -> `%s`\n\n%s", res.Node.RealName, res.Node.Target.RealName, formatNodeInfo(res.Node.Target))
} else {
content = formatNodeInfo(res.Node)
}
} else if res.Field != nil {
content = fmt.Sprintf("**Field**: `%s`", res.Field.Name)
} else if res.Reference != nil {
@@ -440,12 +553,314 @@ func handleHover(params HoverParams) *Hover {
}
}
func HandleCompletion(params CompletionParams) *CompletionList {
uri := params.TextDocument.URI
path := uriToPath(uri)
text, ok := Documents[uri]
if !ok {
return nil
}
lines := strings.Split(text, "\n")
if params.Position.Line >= len(lines) {
return nil
}
lineStr := lines[params.Position.Line]
col := params.Position.Character
if col > len(lineStr) {
col = len(lineStr)
}
prefix := lineStr[:col]
// Case 1: Assigning a value (Ends with "=" or "= ")
if strings.Contains(prefix, "=") {
lastIdx := strings.LastIndex(prefix, "=")
beforeEqual := prefix[:lastIdx]
// Find the last identifier before '='
key := ""
re := regexp.MustCompile(`[a-zA-Z][a-zA-Z0-9_\-]*`)
matches := re.FindAllString(beforeEqual, -1)
if len(matches) > 0 {
key = matches[len(matches)-1]
}
if key == "Class" {
return suggestClasses()
}
container := Tree.GetNodeContaining(path, parser.Position{Line: params.Position.Line + 1, Column: col + 1})
if container != nil {
return suggestFieldValues(container, key, path)
}
return nil
}
// Case 2: Typing a key inside an object
container := Tree.GetNodeContaining(path, parser.Position{Line: params.Position.Line + 1, Column: col + 1})
if container != nil {
return suggestFields(container)
}
return nil
}
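The completion trigger above recovers the key being assigned by taking the last identifier before the final `=` on the line. A minimal standalone sketch of that extraction (`keyBeforeEqual` is a hypothetical helper, not part of the server):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// keyBeforeEqual returns the last identifier preceding the final '=' in
// the text left of the cursor, or "" if there is no assignment.
// (Sketch of the key-extraction logic used by HandleCompletion.)
func keyBeforeEqual(prefix string) string {
	idx := strings.LastIndex(prefix, "=")
	if idx < 0 {
		return ""
	}
	re := regexp.MustCompile(`[a-zA-Z][a-zA-Z0-9_\-]*`)
	matches := re.FindAllString(prefix[:idx], -1)
	if len(matches) == 0 {
		return ""
	}
	return matches[len(matches)-1]
}

func main() {
	fmt.Println(keyBeforeEqual("    Class ="))
}
```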
func suggestClasses() *CompletionList {
if GlobalSchema == nil {
return nil
}
classesVal := GlobalSchema.Value.LookupPath(cue.ParsePath("#Classes"))
if classesVal.Err() != nil {
return nil
}
iter, err := classesVal.Fields()
if err != nil {
return nil
}
var items []CompletionItem
for iter.Next() {
label := iter.Selector().String()
label = strings.Trim(label, "?!#")
items = append(items, CompletionItem{
Label: label,
Kind: 7, // Class
Detail: "MARTe Class",
})
}
return &CompletionList{Items: items}
}
func suggestFields(container *index.ProjectNode) *CompletionList {
cls := container.Metadata["Class"]
if cls == "" {
return &CompletionList{Items: []CompletionItem{{
Label: "Class",
Kind: 10, // Property
InsertText: "Class = ",
Detail: "Define object class",
}}}
}
if GlobalSchema == nil {
return nil
}
classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s", cls))
classVal := GlobalSchema.Value.LookupPath(classPath)
if classVal.Err() != nil {
return nil
}
iter, err := classVal.Fields()
if err != nil {
return nil
}
existing := make(map[string]bool)
for _, frag := range container.Fragments {
for _, def := range frag.Definitions {
if f, ok := def.(*parser.Field); ok {
existing[f.Name] = true
}
}
}
for name := range container.Children {
existing[name] = true
}
var items []CompletionItem
for iter.Next() {
label := iter.Selector().String()
label = strings.Trim(label, "?!#")
// Skip if already present
if existing[label] {
continue
}
isOptional := iter.IsOptional()
kind := 10 // Property
detail := "Mandatory"
if isOptional {
detail = "Optional"
}
insertText := label + " = "
val := iter.Value()
if val.Kind() == cue.StructKind {
// Suggest as node
insertText = "+" + label + " = {\n\t$0\n}"
kind = 9 // Module
}
items = append(items, CompletionItem{
Label: label,
Kind: kind,
Detail: detail,
InsertText: insertText,
InsertTextFormat: 2, // Snippet
})
}
return &CompletionList{Items: items}
}
func suggestFieldValues(container *index.ProjectNode, field string, path string) *CompletionList {
var root *index.ProjectNode
if iso, ok := Tree.IsolatedFiles[path]; ok {
root = iso
} else {
root = Tree.Root
}
if field == "DataSource" {
return suggestObjects(root, "DataSource")
}
if field == "Functions" {
return suggestObjects(root, "GAM")
}
if field == "Type" {
return suggestSignalTypes()
}
if list := suggestCUEEnums(container, field); list != nil {
return list
}
return nil
}
func suggestSignalTypes() *CompletionList {
types := []string{
"uint8", "int8", "uint16", "int16", "uint32", "int32", "uint64", "int64",
"float32", "float64", "string", "bool", "char8",
}
var items []CompletionItem
for _, t := range types {
items = append(items, CompletionItem{
Label: t,
Kind: 13, // EnumMember
Detail: "Signal Type",
})
}
return &CompletionList{Items: items}
}
func suggestCUEEnums(container *index.ProjectNode, field string) *CompletionList {
if GlobalSchema == nil {
return nil
}
cls := container.Metadata["Class"]
if cls == "" {
return nil
}
classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s.%s", cls, field))
val := GlobalSchema.Value.LookupPath(classPath)
if val.Err() != nil {
return nil
}
op, args := val.Expr()
var values []cue.Value
if op == cue.OrOp {
values = args
} else {
values = []cue.Value{val}
}
var items []CompletionItem
for _, v := range values {
if !v.IsConcrete() {
continue
}
str, err := v.String() // String() succeeds only for string kinds; other enum values are skipped
if err != nil {
continue
}
// Ensure strings are quoted
if v.Kind() == cue.StringKind && !strings.HasPrefix(str, "\"") {
str = fmt.Sprintf("\"%s\"", str)
}
items = append(items, CompletionItem{
Label: str,
Kind: 13, // EnumMember
Detail: "Enum Value",
})
}
if len(items) > 0 {
return &CompletionList{Items: items}
}
return nil
}
func suggestObjects(root *index.ProjectNode, filter string) *CompletionList {
if root == nil {
return nil
}
var items []CompletionItem
var walk func(*index.ProjectNode)
walk = func(node *index.ProjectNode) {
match := false
if filter == "GAM" {
if isGAM(node) {
match = true
}
} else if filter == "DataSource" {
if isDataSource(node) {
match = true
}
}
if match {
items = append(items, CompletionItem{
Label: node.Name,
Kind: 6, // Variable
Detail: node.Metadata["Class"],
})
}
for _, child := range node.Children {
walk(child)
}
}
walk(root)
return &CompletionList{Items: items}
}
func isGAM(node *index.ProjectNode) bool {
if node.RealName == "" || (node.RealName[0] != '+' && node.RealName[0] != '$') {
return false
}
_, hasInput := node.Children["InputSignals"]
_, hasOutput := node.Children["OutputSignals"]
return hasInput || hasOutput
}
func isDataSource(node *index.ProjectNode) bool {
if node.Parent != nil && node.Parent.Name == "Data" {
return true
}
_, hasSignals := node.Children["Signals"]
return hasSignals
}
func HandleDefinition(params DefinitionParams) any {
path := uriToPath(params.TextDocument.URI)
line := params.Position.Line + 1
col := params.Position.Character + 1
res := Tree.Query(path, line, col)
if res == nil {
return nil
}
@@ -454,8 +869,12 @@ func handleDefinition(params DefinitionParams) any {
if res.Reference != nil && res.Reference.Target != nil {
targetNode = res.Reference.Target
} else if res.Node != nil {
if res.Node.Target != nil {
targetNode = res.Node.Target
} else {
targetNode = res.Node
}
}
if targetNode != nil {
var locations []Location
@@ -476,12 +895,12 @@ func handleDefinition(params DefinitionParams) any {
return nil
}
func HandleReferences(params ReferenceParams) []Location {
path := uriToPath(params.TextDocument.URI)
line := params.Position.Line + 1
col := params.Position.Character + 1
res := Tree.Query(path, line, col)
if res == nil {
return nil
}
@@ -497,23 +916,30 @@ func handleReferences(params ReferenceParams) []Location {
return nil
}
// Resolve canonical target (follow link if present)
canonical := targetNode
if targetNode.Target != nil {
canonical = targetNode.Target
}
var locations []Location
if params.Context.IncludeDeclaration {
for _, frag := range canonical.Fragments {
if frag.IsObject {
locations = append(locations, Location{
URI: "file://" + frag.File,
Range: Range{
Start: Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1},
End: Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1 + len(canonical.RealName)},
},
})
}
}
}
// 1. References from index (Aliases)
for _, ref := range Tree.References {
if ref.Target == canonical {
locations = append(locations, Location{
URI: "file://" + ref.File,
Range: Range{
@@ -524,17 +950,33 @@ func handleReferences(params ReferenceParams) []Location {
}
}
// 2. References from Node Targets (Direct References)
Tree.Walk(func(node *index.ProjectNode) {
if node.Target == canonical {
for _, frag := range node.Fragments {
if frag.IsObject {
locations = append(locations, Location{
URI: "file://" + frag.File,
Range: Range{
Start: Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1},
End: Position{Line: frag.ObjectPos.Line - 1, Character: frag.ObjectPos.Column - 1 + len(node.RealName)},
},
})
}
}
}
})
return locations
}
func formatNodeInfo(node *index.ProjectNode) string {
info := ""
if class := node.Metadata["Class"]; class != "" {
info = fmt.Sprintf("`%s:%s`\n\n", class, node.RealName[1:])
} else {
info = fmt.Sprintf("`%s`\n\n", node.RealName)
}
// Check if it's a Signal (has Type or DataSource)
typ := node.Metadata["Type"]
ds := node.Metadata["DataSource"]
@@ -560,6 +1002,57 @@ elems := node.Metadata["NumberOfElements"]
if node.Doc != "" {
info += fmt.Sprintf("\n\n%s", node.Doc)
}
// Find references
var refs []string
for _, ref := range Tree.References {
if ref.Target == node {
container := Tree.GetNodeContaining(ref.File, ref.Position)
if container != nil {
threadName := ""
stateName := ""
curr := container
for curr != nil {
if cls, ok := curr.Metadata["Class"]; ok {
if cls == "RealTimeThread" {
threadName = curr.RealName
}
if cls == "RealTimeState" {
stateName = curr.RealName
}
}
curr = curr.Parent
}
if threadName != "" || stateName != "" {
refStr := ""
if stateName != "" {
refStr += fmt.Sprintf("State: `%s`", stateName)
}
if threadName != "" {
if refStr != "" {
refStr += ", "
}
refStr += fmt.Sprintf("Thread: `%s`", threadName)
}
refs = append(refs, refStr)
}
}
}
}
if len(refs) > 0 {
uniqueRefs := make(map[string]bool)
info += "\n\n**Referenced in**:\n"
for _, r := range refs {
if !uniqueRefs[r] {
uniqueRefs[r] = true
info += fmt.Sprintf("- %s\n", r)
}
}
}
return info
}


@@ -1,210 +0,0 @@
package lsp
import (
"encoding/json"
"os"
"path/filepath"
"strings"
"testing"
"github.com/marte-dev/marte-dev-tools/internal/index"
"github.com/marte-dev/marte-dev-tools/internal/parser"
)
func TestInitProjectScan(t *testing.T) {
// 1. Setup temp dir with files
tmpDir, err := os.MkdirTemp("", "lsp_test")
if err != nil {
t.Fatal(err)
}
defer os.RemoveAll(tmpDir)
// File 1: Definition
if err := os.WriteFile(filepath.Join(tmpDir, "def.marte"), []byte("#package Test.Common\n+Target = { Class = C }"), 0644); err != nil {
t.Fatal(err)
}
// File 2: Reference
// "#package Test.Common\n" occupies line 0, so
// "+Source = { Class = C Link = Target }" sits on line 1 (0-based),
// with "Target" starting at 0-based column 29.
if err := os.WriteFile(filepath.Join(tmpDir, "ref.marte"), []byte("#package Test.Common\n+Source = { Class = C Link = Target }"), 0644); err != nil {
t.Fatal(err)
}
// 2. Initialize
tree = index.NewProjectTree() // Reset global tree
initParams := InitializeParams{RootPath: tmpDir}
paramsBytes, _ := json.Marshal(initParams)
msg := &JsonRpcMessage{
Method: "initialize",
Params: paramsBytes,
ID: 1,
}
handleMessage(msg)
// Query the reference in ref.marte at "Target"
// Target starts at index 29 (0-based) on Line 1
defParams := DefinitionParams{
TextDocument: TextDocumentIdentifier{URI: "file://" + filepath.Join(tmpDir, "ref.marte")},
Position: Position{Line: 1, Character: 29},
}
res := handleDefinition(defParams)
if res == nil {
t.Fatal("Definition not found via LSP after initialization")
}
locs, ok := res.([]Location)
if !ok {
t.Fatalf("Expected []Location, got %T", res)
}
if len(locs) == 0 {
t.Fatal("No locations found")
}
// Verify uri points to def.marte
expectedURI := "file://" + filepath.Join(tmpDir, "def.marte")
if locs[0].URI != expectedURI {
t.Errorf("Expected URI %s, got %s", expectedURI, locs[0].URI)
}
}
func TestHandleDefinition(t *testing.T) {
// Reset tree for test
tree = index.NewProjectTree()
content := `
+MyObject = {
Class = Type
}
+RefObject = {
Class = Type
RefField = MyObject
}
`
path := "/test.marte"
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
tree.AddFile(path, config)
tree.ResolveReferences()
t.Logf("Refs: %d", len(tree.References))
for _, r := range tree.References {
t.Logf(" %s at %d:%d", r.Name, r.Position.Line, r.Position.Column)
}
// Test Go to Definition on MyObject reference
params := DefinitionParams{
TextDocument: TextDocumentIdentifier{URI: "file://" + path},
Position: Position{Line: 6, Character: 15}, // "MyObject" in RefField = MyObject
}
result := handleDefinition(params)
if result == nil {
t.Fatal("handleDefinition returned nil")
}
locations, ok := result.([]Location)
if !ok {
t.Fatalf("Expected []Location, got %T", result)
}
if len(locations) != 1 {
t.Fatalf("Expected 1 location, got %d", len(locations))
}
if locations[0].Range.Start.Line != 1 { // +MyObject is on line 2 (0-indexed 1)
t.Errorf("Expected definition on line 1, got %d", locations[0].Range.Start.Line)
}
}
func TestHandleReferences(t *testing.T) {
// Reset tree for test
tree = index.NewProjectTree()
content := `
+MyObject = {
Class = Type
}
+RefObject = {
Class = Type
RefField = MyObject
}
+AnotherRef = {
Ref = MyObject
}
`
path := "/test.marte"
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
tree.AddFile(path, config)
tree.ResolveReferences()
// Test Find References for MyObject (triggered from its definition)
params := ReferenceParams{
TextDocument: TextDocumentIdentifier{URI: "file://" + path},
Position: Position{Line: 1, Character: 1}, // "+MyObject"
Context: ReferenceContext{IncludeDeclaration: true},
}
locations := handleReferences(params)
if len(locations) != 3 { // 1 declaration + 2 references
t.Fatalf("Expected 3 locations, got %d", len(locations))
}
}
func TestLSPFormatting(t *testing.T) {
// Setup
content := `
#package Proj.Main
+Object={
Field=1
}
`
uri := "file:///test.marte"
// Open (populate documents map)
documents[uri] = content
// Format
params := DocumentFormattingParams{
TextDocument: TextDocumentIdentifier{URI: uri},
}
edits := handleFormatting(params)
if len(edits) != 1 {
t.Fatalf("Expected 1 edit, got %d", len(edits))
}
newText := edits[0].NewText
expected := `#package Proj.Main
+Object = {
Field = 1
}
`
// Normalize newlines for comparison just in case
if strings.TrimSpace(strings.ReplaceAll(newText, "\r\n", "\n")) != strings.TrimSpace(strings.ReplaceAll(expected, "\r\n", "\n")) {
t.Errorf("Formatting mismatch.\nExpected:\n%s\nGot:\n%s", expected, newText)
}
}


@@ -22,6 +22,7 @@ const (
TokenPragma
TokenComment
TokenDocstring
TokenComma
)
type Token struct {
@@ -121,6 +122,8 @@ func (l *Lexer) NextToken() Token {
return l.emit(TokenLBrace)
case '}':
return l.emit(TokenRBrace)
case ',':
return l.emit(TokenComma)
case '"':
return l.lexString()
case '/':
@@ -148,7 +151,7 @@ func (l *Lexer) NextToken() Token {
func (l *Lexer) lexIdentifier() Token {
for {
r := l.next()
if unicode.IsLetter(r) || unicode.IsDigit(r) || r == '_' || r == '-' || r == '.' || r == ':' {
continue
}
l.backup()
@@ -186,7 +189,7 @@ func (l *Lexer) lexString() Token {
func (l *Lexer) lexNumber() Token {
for {
r := l.next()
if unicode.IsDigit(r) || unicode.IsLetter(r) || r == '.' || r == '-' || r == '+' {
continue
}
l.backup()
@@ -206,6 +209,20 @@ func (l *Lexer) lexComment() Token {
}
return l.lexUntilNewline(TokenComment)
}
if r == '*' {
for {
r := l.next()
if r == -1 {
return l.emit(TokenError)
}
if r == '*' {
if l.peek() == '/' {
l.next() // consume /
return l.emit(TokenComment)
}
}
}
}
l.backup()
return l.emit(TokenError)
}
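The block-comment branch added above scans until a closing `*/`, emitting an error token at EOF. A standalone sketch of the same scan over a plain string (`scanBlockComment` is a hypothetical helper, not the lexer's real API):

```go
package main

import (
	"fmt"
	"strings"
)

// scanBlockComment returns the "/* ... */" comment at the start of input
// and the index just past its closing "*/", or ("", -1) when the input
// does not start a comment or the comment is unterminated (the lexer
// above emits TokenError in that case).
func scanBlockComment(input string) (string, int) {
	if !strings.HasPrefix(input, "/*") {
		return "", -1
	}
	end := strings.Index(input[2:], "*/")
	if end < 0 {
		return "", -1 // unterminated comment
	}
	stop := 2 + end + 2 // skip opener, body, and closer
	return input[:stop], stop
}

func main() {
	fmt.Println(scanBlockComment("/* doc */ rest"))
}
```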


@@ -11,6 +11,7 @@ type Parser struct {
buf []Token
comments []Comment
pragmas []Pragma
errors []error
}
func NewParser(input string) *Parser {
@@ -19,6 +20,10 @@ func NewParser(input string) *Parser {
}
}
func (p *Parser) addError(pos Position, msg string) {
p.errors = append(p.errors, fmt.Errorf("%d:%d: %s", pos.Line, pos.Column, msg))
}
func (p *Parser) next() Token {
if len(p.buf) > 0 {
t := p.buf[0]
@@ -71,72 +76,82 @@ func (p *Parser) Parse() (*Configuration, error) {
continue
}
def, ok := p.parseDefinition()
if ok {
config.Definitions = append(config.Definitions, def)
} else {
// Synchronization: skip token if not consumed to make progress
if p.peek() == tok {
p.next()
}
}
}
config.Comments = p.comments
config.Pragmas = p.pragmas
var err error
if len(p.errors) > 0 {
err = p.errors[0]
}
return config, err
}
func (p *Parser) parseDefinition() (Definition, bool) {
tok := p.next()
switch tok.Type {
case TokenIdentifier:
// Could be Field = Value OR Node = { ... }
name := tok.Value
if p.peek().Type != TokenEqual {
p.addError(tok.Position, "expected =")
return nil, false
}
p.next() // Consume =
// Disambiguate based on RHS
nextTok := p.peek()
if nextTok.Type == TokenLBrace {
// Check if it looks like a Subnode (contains definitions) or Array (contains values)
if p.isSubnodeLookahead() {
sub, ok := p.parseSubnode()
if !ok {
return nil, false
}
return &ObjectNode{
Position: tok.Position,
Name: name,
Subnode: sub,
}, true
}
}
// Default to Field
val, ok := p.parseValue()
if !ok {
return nil, false
}
return &Field{
Position: tok.Position,
Name: name,
Value: val,
}, true
case TokenObjectIdentifier:
// node = subnode
name := tok.Value
if p.peek().Type != TokenEqual {
p.addError(tok.Position, "expected =")
return nil, false
}
p.next() // Consume =
sub, ok := p.parseSubnode()
if !ok {
return nil, false
}
return &ObjectNode{
Position: tok.Position,
Name: name,
Subnode: sub,
}, true
default:
p.addError(tok.Position, fmt.Sprintf("unexpected token %v", tok.Value))
return nil, false
}
}
@@ -176,10 +191,11 @@ func (p *Parser) isSubnodeLookahead() bool {
return false
}
func (p *Parser) parseSubnode() (Subnode, bool) {
tok := p.next()
if tok.Type != TokenLBrace {
p.addError(tok.Position, "expected {")
return Subnode{}, false
}
sub := Subnode{Position: tok.Position}
for {
@@ -190,18 +206,23 @@ func (p *Parser) parseSubnode() (Subnode, error) {
break
}
if t.Type == TokenEOF {
p.addError(t.Position, "unexpected EOF, expected }")
sub.EndPosition = t.Position
return sub, true
}
def, ok := p.parseDefinition()
if ok {
sub.Definitions = append(sub.Definitions, def)
} else {
if p.peek() == t {
p.next()
}
}
}
return sub, true
}
func (p *Parser) parseValue() (Value, bool) {
tok := p.next()
switch tok.Type {
case TokenString:
@@ -209,24 +230,21 @@ func (p *Parser) parseValue() (Value, error) {
Position: tok.Position,
Value: strings.Trim(tok.Value, "\""),
Quoted: true,
}, true
case TokenNumber:
// Simplistic handling
if strings.Contains(tok.Value, ".") || strings.Contains(tok.Value, "e") {
f, _ := strconv.ParseFloat(tok.Value, 64)
return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, true
}
i, _ := strconv.ParseInt(tok.Value, 0, 64)
return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, true
case TokenBool:
return &BoolValue{Position: tok.Position, Value: tok.Value == "true"}, true
case TokenIdentifier:
return &ReferenceValue{Position: tok.Position, Value: tok.Value}, true
case TokenLBrace:
// array
arr := &ArrayValue{Position: tok.Position}
for {
t := p.peek()
@@ -235,14 +253,19 @@ func (p *Parser) parseValue() (Value, error) {
arr.EndPosition = endTok.Position arr.EndPosition = endTok.Position
break break
} }
val, err := p.parseValue() if t.Type == TokenComma {
if err != nil { p.next()
return nil, err continue
}
val, ok := p.parseValue()
if !ok {
return nil, false
} }
arr.Elements = append(arr.Elements, val) arr.Elements = append(arr.Elements, val)
} }
return arr, nil return arr, true
default: default:
return nil, fmt.Errorf("%d:%d: unexpected value token %v", tok.Position.Line, tok.Position.Column, tok.Value) p.addError(tok.Position, fmt.Sprintf("unexpected value token %v", tok.Value))
return nil, false
} }
} }

internal/schema/marte.cue (new file, 297 lines)

@@ -0,0 +1,297 @@
package schema
#Classes: {
RealTimeApplication: {
Functions: {...} // type: node
Data!: {...} // type: node
States!: {...} // type: node
...
}
Message: {
...
}
StateMachineEvent: {
NextState!: string
NextStateError!: string
Timeout: uint32
[_= !~"^(Class|NextState|Timeout|NextStateError|[#_$].+)$"]: Message
...
}
_State: {
Class: "ReferenceContainer"
ENTER?: {
Class: "ReferenceContainer"
...
}
[_ = !~"^(Class|ENTER)$"]: StateMachineEvent
...
}
StateMachine: {
[_ = !~"^(Class|[$].*)$"]: _State
...
}
RealTimeState: {
Threads: {...} // type: node
...
}
RealTimeThread: {
Functions: [...] // type: array
...
}
GAMScheduler: {
TimingDataSource: string // type: reference
...
}
TimingDataSource: {
direction: "IN"
...
}
IOGAM: {
InputSignals?: {...} // type: node
OutputSignals?: {...} // type: node
...
}
ReferenceContainer: {
...
}
ConstantGAM: {
...
}
PIDGAM: {
Kp: float | int // type: float (allow int as it promotes)
Ki: float | int
Kd: float | int
...
}
FileDataSource: {
Filename: string
Format?: string
direction: "INOUT"
...
}
LoggerDataSource: {
direction: "OUT"
...
}
DANStream: {
Timeout?: int
direction: "OUT"
...
}
EPICSCAInput: {
direction: "IN"
...
}
EPICSCAOutput: {
direction: "OUT"
...
}
EPICSPVAInput: {
direction: "IN"
...
}
EPICSPVAOutput: {
direction: "OUT"
...
}
SDNSubscriber: {
Address: string
Port: int
Interface?: string
direction: "IN"
...
}
SDNPublisher: {
Address: string
Port: int
Interface?: string
direction: "OUT"
...
}
UDPReceiver: {
Port: int
Address?: string
direction: "IN"
...
}
UDPSender: {
Destination: string
direction: "OUT"
...
}
FileReader: {
Filename: string
Format?: string
Interpolate?: string
direction: "IN"
...
}
FileWriter: {
Filename: string
Format?: string
StoreOnTrigger?: int
direction: "OUT"
...
}
OrderedClass: {
First: int
Second: string
...
}
BaseLib2GAM: {...}
ConversionGAM: {...}
DoubleHandshakeGAM: {...}
FilterGAM: {
Num: [...]
Den: [...]
ResetInEachState?: _
InputSignals?: {...}
OutputSignals?: {...}
...
}
HistogramGAM: {
BeginCycleNumber?: int
StateChangeResetName?: string
InputSignals?: {...}
OutputSignals?: {...}
...
}
Interleaved2FlatGAM: {...}
FlattenedStructIOGAM: {...}
MathExpressionGAM: {
Expression: string
InputSignals?: {...}
OutputSignals?: {...}
...
}
MessageGAM: {...}
MuxGAM: {...}
SimulinkWrapperGAM: {...}
SSMGAM: {...}
StatisticsGAM: {...}
TimeCorrectionGAM: {...}
TriggeredIOGAM: {...}
WaveformGAM: {...}
DAN: {
direction: "OUT"
...
}
LinuxTimer: {
ExecutionMode?: string
SleepNature?: string
SleepPercentage?: _
Phase?: int
CPUMask?: int
TimeProvider?: {...}
Signals: {...}
direction: "IN"
...
}
LinkDataSource: {
direction: "INOUT"
...
}
MDSReader: {
TreeName: string
ShotNumber: int
Frequency: float | int
Signals: {...}
direction: "IN"
...
}
MDSWriter: {
NumberOfBuffers: int
CPUMask: int
StackSize: int
TreeName: string
PulseNumber?: int
StoreOnTrigger: int
EventName: string
TimeRefresh: float | int
NumberOfPreTriggers?: int
NumberOfPostTriggers?: int
Signals: {...}
Messages?: {...}
direction: "OUT"
...
}
NI1588TimeStamp: {
direction: "IN"
...
}
NI6259ADC: {
direction: "IN"
...
}
NI6259DAC: {
direction: "OUT"
...
}
NI6259DIO: {
direction: "INOUT"
...
}
NI6368ADC: {
direction: "IN"
...
}
NI6368DAC: {
direction: "OUT"
...
}
NI6368DIO: {
direction: "INOUT"
...
}
NI9157CircularFifoReader: {
direction: "IN"
...
}
NI9157MxiDataSource: {
direction: "INOUT"
...
}
OPCUADSInput: {
direction: "IN"
...
}
OPCUADSOutput: {
direction: "OUT"
...
}
RealTimeThreadAsyncBridge: {...}
RealTimeThreadSynchronisation: {...}
UARTDataSource: {
direction: "INOUT"
...
}
BaseLib2Wrapper: {...}
EPICSCAClient: {...}
EPICSPVA: {...}
MemoryGate: {...}
OPCUA: {...}
SysLogger: {...}
GAMDataSource: {
direction: "INOUT"
...
}
}
// Definition for any Object.
// It must have a Class field; based on Class, it is validated against #Classes.
#Object: {
Class: string
// Allow any other field by default (extensibility);
// the #Classes definitions above are open structs.
...
// `#Classes[Class]` fails if Class is not a known key, so an object with an
// unknown Class is rejected here; callers that want to tolerate unknown
// classes must check membership before unifying.
#Classes[Class]
}

internal/schema/marte.json (deleted file, 209 lines)

@@ -1,209 +0,0 @@
{
"classes": {
"RealTimeApplication": {
"fields": [
{"name": "Functions", "type": "node", "mandatory": true},
{"name": "Data", "type": "node", "mandatory": true},
{"name": "States", "type": "node", "mandatory": true}
]
},
"StateMachine": {
"fields": [
{"name": "States", "type": "node", "mandatory": true}
]
},
"GAMScheduler": {
"fields": [
{"name": "TimingDataSource", "type": "reference", "mandatory": true}
]
},
"TimingDataSource": {
"fields": []
},
"IOGAM": {
"fields": [
{"name": "InputSignals", "type": "node", "mandatory": false},
{"name": "OutputSignals", "type": "node", "mandatory": false}
]
},
"ReferenceContainer": {
"fields": []
},
"ConstantGAM": {
"fields": []
},
"PIDGAM": {
"fields": [
{"name": "Kp", "type": "float", "mandatory": true},
{"name": "Ki", "type": "float", "mandatory": true},
{"name": "Kd", "type": "float", "mandatory": true}
]
},
"FileDataSource": {
"fields": [
{"name": "Filename", "type": "string", "mandatory": true},
{"name": "Format", "type": "string", "mandatory": false}
]
},
"LoggerDataSource": {
"fields": []
},
"DANStream": {
"fields": [
{"name": "Timeout", "type": "int", "mandatory": false}
]
},
"EPICSCAInput": {
"fields": []
},
"EPICSCAOutput": {
"fields": []
},
"EPICSPVAInput": {
"fields": []
},
"EPICSPVAOutput": {
"fields": []
},
"SDNSubscriber": {
"fields": [
{"name": "Address", "type": "string", "mandatory": true},
{"name": "Port", "type": "int", "mandatory": true},
{"name": "Interface", "type": "string", "mandatory": false}
]
},
"SDNPublisher": {
"fields": [
{"name": "Address", "type": "string", "mandatory": true},
{"name": "Port", "type": "int", "mandatory": true},
{"name": "Interface", "type": "string", "mandatory": false}
]
},
"UDPReceiver": {
"fields": [
{"name": "Port", "type": "int", "mandatory": true},
{"name": "Address", "type": "string", "mandatory": false}
]
},
"UDPSender": {
"fields": [
{"name": "Destination", "type": "string", "mandatory": true}
]
},
"FileReader": {
"fields": [
{"name": "Filename", "type": "string", "mandatory": true},
{"name": "Format", "type": "string", "mandatory": false},
{"name": "Interpolate", "type": "string", "mandatory": false}
]
},
"FileWriter": {
"fields": [
{"name": "Filename", "type": "string", "mandatory": true},
{"name": "Format", "type": "string", "mandatory": false},
{"name": "StoreOnTrigger", "type": "int", "mandatory": false}
]
},
"OrderedClass": {
"ordered": true,
"fields": [
{"name": "First", "type": "int", "mandatory": true},
{"name": "Second", "type": "string", "mandatory": true}
]
},
"BaseLib2GAM": { "fields": [] },
"ConversionGAM": { "fields": [] },
"DoubleHandshakeGAM": { "fields": [] },
"FilterGAM": {
"fields": [
{"name": "Num", "type": "array", "mandatory": true},
{"name": "Den", "type": "array", "mandatory": true},
{"name": "ResetInEachState", "type": "any", "mandatory": false},
{"name": "InputSignals", "type": "node", "mandatory": false},
{"name": "OutputSignals", "type": "node", "mandatory": false}
]
},
"HistogramGAM": {
"fields": [
{"name": "BeginCycleNumber", "type": "int", "mandatory": false},
{"name": "StateChangeResetName", "type": "string", "mandatory": false},
{"name": "InputSignals", "type": "node", "mandatory": false},
{"name": "OutputSignals", "type": "node", "mandatory": false}
]
},
"Interleaved2FlatGAM": { "fields": [] },
"FlattenedStructIOGAM": { "fields": [] },
"MathExpressionGAM": {
"fields": [
{"name": "Expression", "type": "string", "mandatory": true},
{"name": "InputSignals", "type": "node", "mandatory": false},
{"name": "OutputSignals", "type": "node", "mandatory": false}
]
},
"MessageGAM": { "fields": [] },
"MuxGAM": { "fields": [] },
"SimulinkWrapperGAM": { "fields": [] },
"SSMGAM": { "fields": [] },
"StatisticsGAM": { "fields": [] },
"TimeCorrectionGAM": { "fields": [] },
"TriggeredIOGAM": { "fields": [] },
"WaveformGAM": { "fields": [] },
"DAN": { "fields": [] },
"LinuxTimer": {
"fields": [
{"name": "ExecutionMode", "type": "string", "mandatory": false},
{"name": "SleepNature", "type": "string", "mandatory": false},
{"name": "SleepPercentage", "type": "any", "mandatory": false},
{"name": "Phase", "type": "int", "mandatory": false},
{"name": "CPUMask", "type": "int", "mandatory": false},
{"name": "TimeProvider", "type": "node", "mandatory": false},
{"name": "Signals", "type": "node", "mandatory": true}
]
},
"LinkDataSource": { "fields": [] },
"MDSReader": {
"fields": [
{"name": "TreeName", "type": "string", "mandatory": true},
{"name": "ShotNumber", "type": "int", "mandatory": true},
{"name": "Frequency", "type": "float", "mandatory": true},
{"name": "Signals", "type": "node", "mandatory": true}
]
},
"MDSWriter": {
"fields": [
{"name": "NumberOfBuffers", "type": "int", "mandatory": true},
{"name": "CPUMask", "type": "int", "mandatory": true},
{"name": "StackSize", "type": "int", "mandatory": true},
{"name": "TreeName", "type": "string", "mandatory": true},
{"name": "PulseNumber", "type": "int", "mandatory": false},
{"name": "StoreOnTrigger", "type": "int", "mandatory": true},
{"name": "EventName", "type": "string", "mandatory": true},
{"name": "TimeRefresh", "type": "float", "mandatory": true},
{"name": "NumberOfPreTriggers", "type": "int", "mandatory": false},
{"name": "NumberOfPostTriggers", "type": "int", "mandatory": false},
{"name": "Signals", "type": "node", "mandatory": true},
{"name": "Messages", "type": "node", "mandatory": false}
]
},
"NI1588TimeStamp": { "fields": [] },
"NI6259ADC": { "fields": [] },
"NI6259DAC": { "fields": [] },
"NI6259DIO": { "fields": [] },
"NI6368ADC": { "fields": [] },
"NI6368DAC": { "fields": [] },
"NI6368DIO": { "fields": [] },
"NI9157CircularFifoReader": { "fields": [] },
"NI9157MxiDataSource": { "fields": [] },
"OPCUADSInput": { "fields": [] },
"OPCUADSOutput": { "fields": [] },
"RealTimeThreadAsyncBridge": { "fields": [] },
"RealTimeThreadSynchronisation": { "fields": [] },
"UARTDataSource": { "fields": [] },
"BaseLib2Wrapper": { "fields": [] },
"EPICSCAClient": { "fields": [] },
"EPICSPVA": { "fields": [] },
"MemoryGate": { "fields": [] },
"OPCUA": { "fields": [] },
"SysLogger": { "fields": [] }
}
}


@@ -2,133 +2,73 @@ package schema
 
 import (
 	_ "embed"
-	"encoding/json"
 	"fmt"
 	"os"
 	"path/filepath"
+
+	"cuelang.org/go/cue"
+	"cuelang.org/go/cue/cuecontext"
 )
 
-//go:embed marte.json
-var defaultSchemaJSON []byte
+//go:embed marte.cue
+var defaultSchemaCUE []byte
 
 type Schema struct {
-	Classes map[string]ClassDefinition `json:"classes"`
-}
-
-type ClassDefinition struct {
-	Fields  []FieldDefinition `json:"fields"`
-	Ordered bool              `json:"ordered"`
-}
-
-type FieldDefinition struct {
-	Name      string `json:"name"`
-	Type      string `json:"type"` // "int", "float", "string", "bool", "reference", "array", "node", "any"
-	Mandatory bool   `json:"mandatory"`
+	Context *cue.Context
+	Value   cue.Value
 }
 
 func NewSchema() *Schema {
+	ctx := cuecontext.New()
 	return &Schema{
-		Classes: make(map[string]ClassDefinition),
+		Context: ctx,
+		Value:   ctx.CompileBytes(defaultSchemaCUE),
 	}
 }
 
-func LoadSchema(path string) (*Schema, error) {
+// LoadSchema loads a CUE schema from a file and returns the cue.Value
+func LoadSchema(ctx *cue.Context, path string) (cue.Value, error) {
 	content, err := os.ReadFile(path)
 	if err != nil {
-		return nil, err
-	}
-	var s Schema
-	if err := json.Unmarshal(content, &s); err != nil {
-		return nil, fmt.Errorf("failed to parse schema: %v", err)
-	}
-	return &s, nil
-}
-
-// DefaultSchema returns the built-in embedded schema
-func DefaultSchema() *Schema {
-	var s Schema
-	if err := json.Unmarshal(defaultSchemaJSON, &s); err != nil {
-		panic(fmt.Sprintf("failed to parse default embedded schema: %v", err))
-	}
-	if s.Classes == nil {
-		s.Classes = make(map[string]ClassDefinition)
-	}
-	return &s
-}
-
-// Merge adds rules from 'other' to 's'.
-// Rules for the same class are merged (new fields added, existing fields updated).
-func (s *Schema) Merge(other *Schema) {
-	if other == nil {
-		return
-	}
-	for className, classDef := range other.Classes {
-		if existingClass, ok := s.Classes[className]; ok {
-			// Merge fields
-			fieldMap := make(map[string]FieldDefinition)
-			for _, f := range classDef.Fields {
-				fieldMap[f.Name] = f
-			}
-			var mergedFields []FieldDefinition
-			seen := make(map[string]bool)
-			// Keep existing fields, update if present in other
-			for _, f := range existingClass.Fields {
-				if newF, ok := fieldMap[f.Name]; ok {
-					mergedFields = append(mergedFields, newF)
-				} else {
-					mergedFields = append(mergedFields, f)
-				}
-				seen[f.Name] = true
-			}
-			// Append new fields
-			for _, f := range classDef.Fields {
-				if !seen[f.Name] {
-					mergedFields = append(mergedFields, f)
-				}
-			}
-			existingClass.Fields = mergedFields
-			if classDef.Ordered {
-				existingClass.Ordered = true
-			}
-			s.Classes[className] = existingClass
-		} else {
-			s.Classes[className] = classDef
-		}
+		return cue.Value{}, err
 	}
+	return ctx.CompileBytes(content), nil
 }
 
 func LoadFullSchema(projectRoot string) *Schema {
-	s := DefaultSchema()
+	ctx := cuecontext.New()
+	baseVal := ctx.CompileBytes(defaultSchemaCUE)
+	if baseVal.Err() != nil {
+		// Fallback or panic? Panic is appropriate for embedded schema failure
+		panic(fmt.Sprintf("Embedded schema invalid: %v", baseVal.Err()))
+	}
+
 	// 1. System Paths
 	sysPaths := []string{
-		"/usr/share/mdt/marte_schema.json",
+		"/usr/share/mdt/marte_schema.cue",
 	}
 	home, err := os.UserHomeDir()
 	if err == nil {
-		sysPaths = append(sysPaths, filepath.Join(home, ".local/share/mdt/marte_schema.json"))
+		sysPaths = append(sysPaths, filepath.Join(home, ".local/share/mdt/marte_schema.cue"))
 	}
 	for _, path := range sysPaths {
-		if sysSchema, err := LoadSchema(path); err == nil {
-			s.Merge(sysSchema)
+		if val, err := LoadSchema(ctx, path); err == nil && val.Err() == nil {
+			baseVal = baseVal.Unify(val)
 		}
 	}
 
 	// 2. Project Path
 	if projectRoot != "" {
-		projectSchemaPath := filepath.Join(projectRoot, ".marte_schema.json")
-		if projSchema, err := LoadSchema(projectSchemaPath); err == nil {
-			s.Merge(projSchema)
+		projectSchemaPath := filepath.Join(projectRoot, ".marte_schema.cue")
+		if val, err := LoadSchema(ctx, projectSchemaPath); err == nil && val.Err() == nil {
+			baseVal = baseVal.Unify(val)
 		}
 	}
 
-	return s
+	return &Schema{
+		Context: ctx,
+		Value:   baseVal,
+	}
 }


@@ -2,9 +2,15 @@ package validator
import ( import (
"fmt" "fmt"
"github.com/marte-dev/marte-dev-tools/internal/index" "strconv"
"github.com/marte-dev/marte-dev-tools/internal/parser" "strings"
"github.com/marte-dev/marte-dev-tools/internal/schema"
"cuelang.org/go/cue"
"cuelang.org/go/cue/errors"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/schema"
) )
type DiagnosticLevel int type DiagnosticLevel int
@@ -38,6 +44,9 @@ func (v *Validator) ValidateProject() {
if v.Tree == nil { if v.Tree == nil {
return return
} }
// Ensure references are resolved (if not already done by builder/lsp)
v.Tree.ResolveReferences()
if v.Tree.Root != nil { if v.Tree.Root != nil {
v.validateNode(v.Tree.Root) v.validateNode(v.Tree.Root)
} }
@@ -47,25 +56,27 @@ func (v *Validator) ValidateProject() {
} }
func (v *Validator) validateNode(node *index.ProjectNode) { func (v *Validator) validateNode(node *index.ProjectNode) {
// Collect fields and their definitions // Check for invalid content in Signals container of DataSource
fields := make(map[string][]*parser.Field) if node.RealName == "Signals" && node.Parent != nil && isDataSource(node.Parent) {
fieldOrder := []string{} // Keep track of order of appearance (approximate across fragments)
for _, frag := range node.Fragments { for _, frag := range node.Fragments {
for _, def := range frag.Definitions { for _, def := range frag.Definitions {
if f, ok := def.(*parser.Field); ok { if f, ok := def.(*parser.Field); ok {
if _, exists := fields[f.Name]; !exists { v.Diagnostics = append(v.Diagnostics, Diagnostic{
fieldOrder = append(fieldOrder, f.Name) Level: LevelError,
Message: fmt.Sprintf("Invalid content in Signals container: Field '%s' is not allowed. Only Signal objects are allowed.", f.Name),
Position: f.Position,
File: frag.File,
})
} }
fields[f.Name] = append(fields[f.Name], f)
} }
} }
} }
// 1. Check for duplicate fields fields := v.getFields(node)
// 1. Check for duplicate fields (Go logic)
for name, defs := range fields { for name, defs := range fields {
if len(defs) > 1 { if len(defs) > 1 {
// Report error on the second definition
firstFile := v.getFileForField(defs[0], node) firstFile := v.getFileForField(defs[0], node)
v.Diagnostics = append(v.Diagnostics, Diagnostic{ v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError, Level: LevelError,
@@ -80,13 +91,7 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
className := "" className := ""
if node.RealName != "" && (node.RealName[0] == '+' || node.RealName[0] == '$') { if node.RealName != "" && (node.RealName[0] == '+' || node.RealName[0] == '$') {
if classFields, ok := fields["Class"]; ok && len(classFields) > 0 { if classFields, ok := fields["Class"]; ok && len(classFields) > 0 {
// Extract class name from value className = v.getFieldValue(classFields[0])
switch val := classFields[0].Value.(type) {
case *parser.StringValue:
className = val.Value
case *parser.ReferenceValue:
className = val.Value
}
} }
hasType := false hasType := false
@@ -104,13 +109,25 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
File: file, File: file,
}) })
} }
if className == "RealTimeThread" {
v.checkFunctionsArray(node, fields)
}
} }
// 3. Schema Validation // 3. CUE Validation
if className != "" && v.Schema != nil { if className != "" && v.Schema != nil {
if classDef, ok := v.Schema.Classes[className]; ok { v.validateWithCUE(node, className)
v.validateClass(node, classDef, fields, fieldOrder)
} }
// 4. Signal Validation (for DataSource signals)
if isSignal(node) {
v.validateSignal(node, fields)
}
// 5. GAM Validation (Signal references)
if isGAM(node) {
v.validateGAM(node)
} }
// Recursively validate children // Recursively validate children
@@ -119,113 +136,414 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
} }
} }
func (v *Validator) validateClass(node *index.ProjectNode, classDef schema.ClassDefinition, fields map[string][]*parser.Field, fieldOrder []string) { func (v *Validator) validateWithCUE(node *index.ProjectNode, className string) {
// Check Mandatory Fields // Check if class exists in schema
for _, fieldDef := range classDef.Fields { classPath := cue.ParsePath(fmt.Sprintf("#Classes.%s", className))
if fieldDef.Mandatory { if v.Schema.Value.LookupPath(classPath).Err() != nil {
found := false return // Unknown class, skip validation
if _, ok := fields[fieldDef.Name]; ok { }
found = true
} else if fieldDef.Type == "node" { // Convert node to map
// Check children for nodes data := v.nodeToMap(node)
if _, ok := node.Children[fieldDef.Name]; ok {
found = true // Encode data to CUE
dataVal := v.Schema.Context.Encode(data)
// Unify with #Object
// #Object requires "Class" field, which is present in data.
objDef := v.Schema.Value.LookupPath(cue.ParsePath("#Object"))
// Unify
res := objDef.Unify(dataVal)
if err := res.Validate(cue.Concrete(true)); err != nil {
// Report errors
// Parse CUE error to diagnostic
v.reportCUEError(err, node)
} }
} }
if !found { func (v *Validator) reportCUEError(err error, node *index.ProjectNode) {
list := errors.Errors(err)
for _, e := range list {
msg := e.Error()
v.Diagnostics = append(v.Diagnostics, Diagnostic{ v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError, Level: LevelError,
Message: fmt.Sprintf("Missing mandatory field '%s' for class '%s'", fieldDef.Name, node.Metadata["Class"]), Message: fmt.Sprintf("Schema Validation Error: %v", msg),
Position: v.getNodePosition(node), Position: v.getNodePosition(node),
File: v.getNodeFile(node), File: v.getNodeFile(node),
}) })
} }
} }
}
// Check Field Types func (v *Validator) nodeToMap(node *index.ProjectNode) map[string]interface{} {
for _, fieldDef := range classDef.Fields { m := make(map[string]interface{})
if fList, ok := fields[fieldDef.Name]; ok { fields := v.getFields(node)
f := fList[0] // Check the first definition (duplicates handled elsewhere)
if !v.checkType(f.Value, fieldDef.Type) { for name, defs := range fields {
v.Diagnostics = append(v.Diagnostics, Diagnostic{ if len(defs) > 0 {
Level: LevelError, // Use the last definition (duplicates checked elsewhere)
Message: fmt.Sprintf("Field '%s' expects type '%s'", fieldDef.Name, fieldDef.Type), m[name] = v.valueToInterface(defs[len(defs)-1].Value)
Position: f.Position,
File: v.getFileForField(f, node),
})
}
} }
} }
// Check Field Order // Children as nested maps?
if classDef.Ordered { // CUE schema expects nested structs for "node" type fields.
// Verify that fields present in the node appear in the order defined in the schema // But `node.Children` contains ALL children (even those defined as +Child).
// Only consider fields that are actually in the schema's field list // If schema expects `States: { ... }`, we map children.
schemaIdx := 0
for _, nodeFieldName := range fieldOrder { for name, child := range node.Children {
// Find this field in schema // normalize name? CUE keys are strings.
foundInSchema := false // If child real name is "+States", key in Children is "States".
for i, fd := range classDef.Fields { // We use "States" as key in map.
if fd.Name == nodeFieldName { m[name] = v.nodeToMap(child)
foundInSchema = true }
// Check if this field appears AFTER the current expected position
if i < schemaIdx { return m
// This field appears out of order (it should have appeared earlier, or previous fields were missing but this one came too late? No, simple relative order) }
// Actually, simple check: `i` must be >= `lastSeenSchemaIdx`.
func (v *Validator) valueToInterface(val parser.Value) interface{} {
switch t := val.(type) {
case *parser.StringValue:
return t.Value
case *parser.IntValue:
i, _ := strconv.ParseInt(t.Raw, 0, 64)
return i // CUE handles int64
case *parser.FloatValue:
f, _ := strconv.ParseFloat(t.Raw, 64)
return f
case *parser.BoolValue:
return t.Value
case *parser.ReferenceValue:
return t.Value
case *parser.ArrayValue:
var arr []interface{}
for _, e := range t.Elements {
arr = append(arr, v.valueToInterface(e))
}
return arr
}
return nil
}
func (v *Validator) validateSignal(node *index.ProjectNode, fields map[string][]*parser.Field) {
// ... (same as before)
if typeFields, ok := fields["Type"]; !ok || len(typeFields) == 0 {
v.Diagnostics = append(v.Diagnostics, Diagnostic{ v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError, Level: LevelError,
Message: fmt.Sprintf("Field '%s' is out of order", nodeFieldName), Message: fmt.Sprintf("Signal '%s' is missing mandatory field 'Type'", node.RealName),
Position: fields[nodeFieldName][0].Position, Position: v.getNodePosition(node),
File: v.getFileForField(fields[nodeFieldName][0], node), File: v.getNodeFile(node),
}) })
} else { } else {
schemaIdx = i typeVal := typeFields[0].Value
var typeStr string
switch t := typeVal.(type) {
case *parser.StringValue:
typeStr = t.Value
case *parser.ReferenceValue:
typeStr = t.Value
default:
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Field 'Type' in Signal '%s' must be a type name", node.RealName),
Position: typeFields[0].Position,
File: v.getFileForField(typeFields[0], node),
})
return
} }
if !isValidType(typeStr) {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Invalid Type '%s' for Signal '%s'", typeStr, node.RealName),
Position: typeFields[0].Position,
File: v.getFileForField(typeFields[0], node),
})
}
}
}
func (v *Validator) validateGAM(node *index.ProjectNode) {
if inputs, ok := node.Children["InputSignals"]; ok {
v.validateGAMSignals(node, inputs, "Input")
}
if outputs, ok := node.Children["OutputSignals"]; ok {
v.validateGAMSignals(node, outputs, "Output")
}
}
func (v *Validator) validateGAMSignals(gamNode, signalsContainer *index.ProjectNode, direction string) {
for _, signal := range signalsContainer.Children {
v.validateGAMSignal(gamNode, signal, direction)
}
}
func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, direction string) {
fields := v.getFields(signalNode)
var dsName string
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
dsName = v.getFieldValue(dsFields[0])
}
if dsName == "" {
return // Ignore implicit signals or missing datasource (handled elsewhere if mandatory)
}
dsNode := v.resolveReference(dsName, v.getNodeFile(signalNode), isDataSource)
if dsNode == nil {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Unknown DataSource '%s' referenced in signal '%s'", dsName, signalNode.RealName),
Position: v.getNodePosition(signalNode),
File: v.getNodeFile(signalNode),
})
return
}
// Link DataSource reference
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
if val, ok := dsFields[0].Value.(*parser.ReferenceValue); ok {
v.updateReferenceTarget(v.getNodeFile(signalNode), val.Position, dsNode)
}
}
// Check Direction using CUE Schema
dsClass := v.getNodeClass(dsNode)
if dsClass != "" {
// Lookup class definition in Schema
// path: #Classes.ClassName.direction
path := cue.ParsePath(fmt.Sprintf("#Classes.%s.direction", dsClass))
val := v.Schema.Value.LookupPath(path)
if val.Err() == nil {
dsDir, err := val.String()
if err == nil && dsDir != "" {
if direction == "Input" && dsDir == "OUT" {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("DataSource '%s' (Class %s) is Output-only but referenced in InputSignals of GAM '%s'", dsName, dsClass, gamNode.RealName),
Position: v.getNodePosition(signalNode),
File: v.getNodeFile(signalNode),
})
}
if direction == "Output" && dsDir == "IN" {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("DataSource '%s' (Class %s) is Input-only but referenced in OutputSignals of GAM '%s'", dsName, dsClass, gamNode.RealName),
Position: v.getNodePosition(signalNode),
File: v.getNodeFile(signalNode),
})
}
}
}
}
// Check Signal Existence
targetSignalName := index.NormalizeName(signalNode.RealName)
if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
targetSignalName = v.getFieldValue(aliasFields[0]) // Alias is usually the name in DataSource
}
var targetNode *index.ProjectNode
if signalsContainer, ok := dsNode.Children["Signals"]; ok {
targetNorm := index.NormalizeName(targetSignalName)
if child, ok := signalsContainer.Children[targetNorm]; ok {
targetNode = child
} else {
// Fallback check
for _, child := range signalsContainer.Children {
if index.NormalizeName(child.RealName) == targetNorm {
targetNode = child
break break
} }
} }
if !foundInSchema { }
// Ignore extra fields for order check? Spec doesn't say strict closed schema. }
if targetNode == nil {
suppressed := v.isGloballyAllowed("implicit", v.getNodeFile(signalNode))
if !suppressed {
for _, p := range signalNode.Pragmas {
if strings.HasPrefix(p, "implicit:") || strings.HasPrefix(p, "ignore(implicit)") {
suppressed = true
break
}
}
}
if !suppressed {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelWarning,
Message: fmt.Sprintf("Implicitly Defined Signal: '%s' is defined in GAM '%s' but not in DataSource '%s'", targetSignalName, gamNode.RealName, dsName),
Position: v.getNodePosition(signalNode),
File: v.getNodeFile(signalNode),
})
}
if typeFields, ok := fields["Type"]; !ok || len(typeFields) == 0 {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Implicit signal '%s' must define Type", targetSignalName),
Position: v.getNodePosition(signalNode),
File: v.getNodeFile(signalNode),
})
} else {
// Check Type validity even for implicit
typeVal := v.getFieldValue(typeFields[0])
if !isValidType(typeVal) {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Invalid Type '%s' for Signal '%s'", typeVal, signalNode.RealName),
Position: typeFields[0].Position,
File: v.getNodeFile(signalNode),
})
}
}
} else {
signalNode.Target = targetNode
// Link Alias reference
if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
if val, ok := aliasFields[0].Value.(*parser.ReferenceValue); ok {
v.updateReferenceTarget(v.getNodeFile(signalNode), val.Position, targetNode)
}
}
// Property checks
v.checkSignalProperty(signalNode, targetNode, "Type")
v.checkSignalProperty(signalNode, targetNode, "NumberOfElements")
v.checkSignalProperty(signalNode, targetNode, "NumberOfDimensions")
// Check Type validity if present
if typeFields, ok := fields["Type"]; ok && len(typeFields) > 0 {
typeVal := v.getFieldValue(typeFields[0])
if !isValidType(typeVal) {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Invalid Type '%s' for Signal '%s'", typeVal, signalNode.RealName),
Position: typeFields[0].Position,
File: v.getNodeFile(signalNode),
})
} }
} }
} }
} }
func (v *Validator) checkSignalProperty(gamSig, dsSig *index.ProjectNode, prop string) {
gamVal := gamSig.Metadata[prop]
dsVal := dsSig.Metadata[prop]
if gamVal == "" {
return
}
if dsVal != "" && gamVal != dsVal {
if prop == "Type" {
if v.checkCastPragma(gamSig, dsVal, gamVal) {
return
}
}
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Signal '%s' property '%s' mismatch: defined '%s', referenced '%s'", gamSig.RealName, prop, dsVal, gamVal),
Position: v.getNodePosition(gamSig),
File: v.getNodeFile(gamSig),
})
}
}
func (v *Validator) checkCastPragma(node *index.ProjectNode, defType, curType string) bool {
for _, p := range node.Pragmas {
if strings.HasPrefix(p, "cast(") {
content := strings.TrimPrefix(p, "cast(")
if idx := strings.Index(content, ")"); idx != -1 {
content = content[:idx]
parts := strings.Split(content, ",")
if len(parts) == 2 {
d := strings.TrimSpace(parts[0])
c := strings.TrimSpace(parts[1])
if d == defType && c == curType {
return true
}
}
}
}
}
return false
}
func (v *Validator) updateReferenceTarget(file string, pos parser.Position, target *index.ProjectNode) {
for i := range v.Tree.References {
ref := &v.Tree.References[i]
if ref.File == file && ref.Position == pos {
ref.Target = target
return
}
}
}
// Helpers
func (v *Validator) getFields(node *index.ProjectNode) map[string][]*parser.Field {
fields := make(map[string][]*parser.Field)
for _, frag := range node.Fragments {
for _, def := range frag.Definitions {
if f, ok := def.(*parser.Field); ok {
fields[f.Name] = append(fields[f.Name], f)
}
}
}
return fields
}
func (v *Validator) getFieldValue(f *parser.Field) string {
switch val := f.Value.(type) {
case *parser.StringValue:
return val.Value
case *parser.ReferenceValue:
return val.Value
case *parser.IntValue:
return val.Raw
case *parser.FloatValue:
return val.Raw
}
return ""
}
func (v *Validator) resolveReference(name string, file string, predicate func(*index.ProjectNode) bool) *index.ProjectNode {
if isoNode, ok := v.Tree.IsolatedFiles[file]; ok {
if found := v.Tree.FindNode(isoNode, name, predicate); found != nil {
return found
}
return nil
}
if v.Tree.Root == nil {
return nil
}
return v.Tree.FindNode(v.Tree.Root, name, predicate)
}
func (v *Validator) getNodeClass(node *index.ProjectNode) string {
if cls, ok := node.Metadata["Class"]; ok {
return cls
}
return ""
}
func isValidType(t string) bool {
switch t {
case "uint8", "int8", "uint16", "int16", "uint32", "int32", "uint64", "int64",
"float32", "float64", "string", "bool", "char8":
return true
}
return false
}
func (v *Validator) checkType(val parser.Value, expectedType string) bool {
// Legacy function, replaced by CUE.
return true
}
@@ -248,6 +566,13 @@ func (v *Validator) CheckUnused() {
}
}
if v.Tree.Root != nil {
v.collectTargetUsage(v.Tree.Root, referencedNodes)
}
for _, node := range v.Tree.IsolatedFiles {
v.collectTargetUsage(node, referencedNodes)
}
if v.Tree.Root != nil {
v.checkUnusedRecursive(v.Tree.Root, referencedNodes)
}
@@ -256,10 +581,29 @@ func (v *Validator) CheckUnused() {
}
}
func (v *Validator) collectTargetUsage(node *index.ProjectNode, referenced map[*index.ProjectNode]bool) {
if node.Target != nil {
referenced[node.Target] = true
}
for _, child := range node.Children {
v.collectTargetUsage(child, referenced)
}
}
func (v *Validator) checkUnusedRecursive(node *index.ProjectNode, referenced map[*index.ProjectNode]bool) {
// Heuristic for GAM
if isGAM(node) {
if !referenced[node] {
suppress := v.isGloballyAllowed("unused", v.getNodeFile(node))
if !suppress {
for _, p := range node.Pragmas {
if strings.HasPrefix(p, "unused:") || strings.HasPrefix(p, "ignore(unused)") {
suppress = true
break
}
}
}
if !suppress {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelWarning,
Message: fmt.Sprintf("Unused GAM: %s is defined but not referenced in any thread or scheduler", node.RealName),
@@ -268,11 +612,24 @@ func (v *Validator) checkUnusedRecursive(node *index.ProjectNode, referenced map
})
}
}
}
// Heuristic for DataSource and its signals
if isDataSource(node) {
if signalsNode, ok := node.Children["Signals"]; ok {
for _, signal := range signalsNode.Children {
if !referenced[signal] {
if v.isGloballyAllowed("unused", v.getNodeFile(signal)) {
continue
}
suppress := false
for _, p := range signal.Pragmas {
if strings.HasPrefix(p, "unused:") || strings.HasPrefix(p, "ignore(unused)") {
suppress = true
break
}
}
if !suppress {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelWarning,
Message: fmt.Sprintf("Unused Signal: %s is defined in DataSource %s but never referenced", signal.RealName, node.RealName),
@@ -282,6 +639,8 @@ func (v *Validator) checkUnusedRecursive(node *index.ProjectNode, referenced map
}
}
}
}
}
for _, child := range node.Children {
v.checkUnusedRecursive(child, referenced)
@@ -301,6 +660,16 @@ func isDataSource(node *index.ProjectNode) bool {
if node.Parent != nil && node.Parent.Name == "Data" {
return true
}
_, hasSignals := node.Children["Signals"]
return hasSignals
}
func isSignal(node *index.ProjectNode) bool {
if node.Parent != nil && node.Parent.Name == "Signals" {
if isDataSource(node.Parent.Parent) {
return true
}
}
return false
}
@@ -317,3 +686,63 @@ func (v *Validator) getNodeFile(node *index.ProjectNode) string {
}
return ""
}
func (v *Validator) checkFunctionsArray(node *index.ProjectNode, fields map[string][]*parser.Field) {
if funcs, ok := fields["Functions"]; ok && len(funcs) > 0 {
f := funcs[0]
if arr, ok := f.Value.(*parser.ArrayValue); ok {
for _, elem := range arr.Elements {
if ref, ok := elem.(*parser.ReferenceValue); ok {
target := v.resolveReference(ref.Value, v.getNodeFile(node), isGAM)
if target == nil {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: fmt.Sprintf("Function '%s' not found or is not a valid GAM", ref.Value),
Position: ref.Position,
File: v.getNodeFile(node),
})
}
} else {
v.Diagnostics = append(v.Diagnostics, Diagnostic{
Level: LevelError,
Message: "Functions array must contain references",
Position: f.Position,
File: v.getNodeFile(node),
})
}
}
}
}
}
func (v *Validator) isGloballyAllowed(warningType string, contextFile string) bool {
prefix1 := fmt.Sprintf("allow(%s)", warningType)
prefix2 := fmt.Sprintf("ignore(%s)", warningType)
// If context file is isolated, only check its own pragmas
if _, isIsolated := v.Tree.IsolatedFiles[contextFile]; isIsolated {
if pragmas, ok := v.Tree.GlobalPragmas[contextFile]; ok {
for _, p := range pragmas {
normalized := strings.ReplaceAll(p, " ", "")
if strings.HasPrefix(normalized, prefix1) || strings.HasPrefix(normalized, prefix2) {
return true
}
}
}
return false
}
// If project file, check all non-isolated files
for file, pragmas := range v.Tree.GlobalPragmas {
if _, isIsolated := v.Tree.IsolatedFiles[file]; isIsolated {
continue
}
for _, p := range pragmas {
normalized := strings.ReplaceAll(p, " ", "")
if strings.HasPrefix(normalized, prefix1) || strings.HasPrefix(normalized, prefix2) {
return true
}
}
}
return false
}
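The `cast(...)` matching used by `checkCastPragma` above can be exercised in isolation. A minimal sketch (hypothetical standalone `matchCastPragma` helper, not the validator API, which also walks node pragmas):

```go
package main

import (
	"fmt"
	"strings"
)

// matchCastPragma reports whether a pragma body of the form
// "cast(DEF_TYPE, CUR_TYPE)" matches the given pair of types.
// Whitespace around the two type names is ignored.
func matchCastPragma(pragma, defType, curType string) bool {
	if !strings.HasPrefix(pragma, "cast(") {
		return false
	}
	content := strings.TrimPrefix(pragma, "cast(")
	idx := strings.Index(content, ")")
	if idx == -1 {
		return false // unclosed pragma
	}
	parts := strings.Split(content[:idx], ",")
	if len(parts) != 2 {
		return false
	}
	return strings.TrimSpace(parts[0]) == defType &&
		strings.TrimSpace(parts[1]) == curType
}

func main() {
	fmt.Println(matchCastPragma("cast(uint32, float32)", "uint32", "float32")) // true
	fmt.Println(matchCastPragma("cast(uint32, float32)", "int64", "float32"))  // false
}
```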

BIN
mdt

Binary file not shown.


@@ -29,7 +29,12 @@ The LSP server should provide the following capabilities:
- **Go to Definition**: Jump to the definition of a reference, supporting navigation across any file in the current project.
- **Go to References**: Find usages of a node or field, supporting navigation across any file in the current project.
- **Code Completion**: Autocomplete fields, values, and references.
- **Context-Aware**: Suggestions depend on the cursor position (e.g., inside an object, assigning a value).
- **Schema-Driven**: Field suggestions are derived from the CUE schema for the current object's Class, indicating mandatory vs. optional fields.
- **Reference Suggestions**:
- `DataSource` fields suggest available DataSource objects.
- `Functions` (in Threads) suggest available GAM objects.
- **Code Snippets**: Provide snippets for common patterns (e.g., `+Object = { ... }`).
- **Formatting**: Format the document using the same rules and engine as the `fmt` command.
## Build System & File Structure
@@ -47,9 +52,9 @@ The LSP server should provide the following capabilities:
- **Namespace Consistency**: The build tool must verify that all input files belong to the same project namespace (the first segment of the `#package` URI). If multiple project namespaces are detected, the build must fail with an error.
- **Target**: The build output is written to a single target file (e.g., provided via CLI or API).
- **Multi-File Definitions**: Nodes and objects can be defined across multiple files. The build tool, validator, and LSP must merge these definitions (including all fields and sub-nodes) from the entire project to create a unified view before processing or validating.
- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project. Support for dot-separated paths (e.g., `Node.SubNode`) is required.
- **Merging Order**: For objects defined across multiple files, definitions are merged. The build tool must preserve the relative order of fields and sub-nodes as they appear in the source files, interleaving them correctly in the final output.
- **Field Order**: Within a single file (and across merged files), the relative order of defined fields must be maintained in the output.
- The LSP indexes only files belonging to the same project/namespace scope.
- **Output**: The output format is the same as the input configuration but without the `#package` macro.
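The merging-order rules above amount to an order-preserving, first-occurrence merge of per-file definition lists. A minimal sketch (hypothetical `field` type and `mergeOrdered` helper; the real builder also carries values and positions, and reports duplicates as errors):

```go
package main

import "fmt"

// field is a simplified stand-in for a parsed definition:
// just a name and the file it came from.
type field struct {
	Name string
	File string
}

// mergeOrdered concatenates per-file definition lists in file
// order, preserving the relative order within each file and
// keeping only the first occurrence of each field name.
func mergeOrdered(files [][]field) []field {
	seen := make(map[string]bool)
	var out []field
	for _, defs := range files {
		for _, f := range defs {
			if seen[f.Name] {
				continue // duplicate definition; the real tool emits a diagnostic
			}
			seen[f.Name] = true
			out = append(out, f)
		}
	}
	return out
}

func main() {
	a := []field{{"Class", "a.marte"}, {"Frequency", "a.marte"}}
	b := []field{{"CPUs", "b.marte"}, {"Class", "b.marte"}}
	for _, f := range mergeOrdered([][]field{a, b}) {
		fmt.Println(f.Name) // Class, Frequency, CPUs
	}
}
```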
@@ -84,8 +89,13 @@ The LSP server should provide the following capabilities:
- **Nodes (`+` / `$`)**: The prefixes `+` and `$` indicate that the node represents an object.
- **Constraint**: These nodes _must_ contain a field named `Class` within their subnode definition (across all files where the node is defined).
- **Signals**: Signals are considered nodes but **not** objects. They do not require a `Class` field.
- **Pragmas (`//!`)**: Used to suppress specific diagnostics. The developer can use these to explain why a rule is being ignored. Supported pragmas:
- `//!unused: REASON` or `//!ignore(unused): REASON` - Suppress "Unused GAM" or "Unused Signal" warnings.
- `//!implicit: REASON` or `//!ignore(implicit): REASON` - Suppress "Implicitly Defined Signal" warnings.
- `//!allow(WARNING_TYPE): REASON` or `//!ignore(WARNING_TYPE): REASON` - Global suppression for a specific warning type across the whole project (supported: `unused`, `implicit`).
- `//!cast(DEF_TYPE, CUR_TYPE): REASON` - Suppress "Type Inconsistency" errors if types match.
- **Structure**: A configuration is composed of one or more definitions.
- **Strictness**: Any content that is not a valid comment (or pragma/docstring) or a valid definition (Field, Node, or Object) is **not allowed** and must generate a parsing error.
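The global `//!allow(...)` / `//!ignore(...)` forms above are matched after stripping whitespace, as in this sketch (hypothetical `suppresses` helper; assumes the `//!` marker has already been removed from the pragma text):

```go
package main

import (
	"fmt"
	"strings"
)

// suppresses reports whether a global pragma body (e.g.
// "allow(unused): reviewed") suppresses the given warning type.
// Spaces are stripped before prefix matching, so
// "allow( unused )" is equivalent to "allow(unused)".
func suppresses(pragma, warningType string) bool {
	normalized := strings.ReplaceAll(pragma, " ", "")
	return strings.HasPrefix(normalized, "allow("+warningType+")") ||
		strings.HasPrefix(normalized, "ignore("+warningType+")")
}

func main() {
	fmt.Println(suppresses("allow( unused ): legacy GAMs kept", "unused")) // true
	fmt.Println(suppresses("ignore(implicit): prototyping", "unused"))     // false
}
```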
### Core MARTe Classes
@@ -105,29 +115,33 @@ MARTe configurations typically involve several main categories of objects:
- **Requirements**:
- All signal definitions **must** include a `Type` field with a valid value.
- **Size Information**: Signals can optionally include `NumberOfDimensions` and `NumberOfElements` fields. If not explicitly defined, these default to `1`.
- **Property Matching**: Signal references in GAMs must match the properties (`Type`, `NumberOfElements`, `NumberOfDimensions`) of the defined signal in the `DataSource`.
- **Extensibility**: Signal definitions can include additional fields as required by the specific application context.
- **Signal Reference Syntax**:
- Signals are referenced or defined in `InputSignals` or `OutputSignals` sub-nodes using one of the following formats:
1. **Direct Reference (Option 1)**:
```
SIGNAL_NAME = {
DataSource = DATASOURCE_NAME
// Other fields if necessary
}
```
2. **Aliased Reference (Option 2)**:
```
GAM_SIGNAL_NAME = {
Alias = SIGNAL_NAME
DataSource = DATASOURCE_NAME
// ...
}
```
In this case, `Alias` points to the DataSource signal name.
- **Implicit Definition Constraint**: If a signal is implicitly defined within a GAM, the `Type` field **must** be present in the reference block to define the signal's properties.
- **Directionality**: DataSources and their signals are directional:
- `Input` (IN): Only providing data. Signals can only be used in `InputSignals`.
- `Output` (OUT): Only receiving data. Signals can only be used in `OutputSignals`.
- `Inout` (INOUT): Bidirectional data flow. Signals can be used in both `InputSignals` and `OutputSignals`.
- **Validation**: The tool must validate that signal usage in GAMs respects the direction of the referenced DataSource.
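The direction check can be sketched as a small predicate (hypothetical `directionAllows` helper; direction spellings follow the list above):

```go
package main

import "fmt"

// directionAllows reports whether a DataSource direction permits
// use of its signals in the given container node
// ("InputSignals" or "OutputSignals").
func directionAllows(direction, container string) bool {
	switch direction {
	case "Input", "IN":
		return container == "InputSignals"
	case "Output", "OUT":
		return container == "OutputSignals"
	case "Inout", "INOUT":
		return container == "InputSignals" || container == "OutputSignals"
	}
	return false // unknown direction: reject, let the schema flag it
}

func main() {
	fmt.Println(directionAllows("Input", "OutputSignals")) // false: misuse
	fmt.Println(directionAllows("Inout", "OutputSignals")) // true
}
```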
### Object Indexing & References
@@ -151,13 +165,13 @@ The tool must build an index of the configuration to support LSP features and va
- **Field Order**: Verification that specific fields appear in a prescribed order when required by the class definition.
- **Conditional Fields**: Validation of fields whose presence or value depends on the values of other fields within the same node or context.
- **Schema Definition**:
- Class validation rules must be defined in a separate schema file using the **CUE** language.
- **Project-Specific Classes**: Developers can define their own project-specific classes and corresponding validation rules, expanding the validation capabilities for their specific needs.
- **Schema Loading**:
- **Default Schema**: The tool should look for a default schema file `marte_schema.cue` in standard system locations:
- `/usr/share/mdt/marte_schema.cue`
- `$HOME/.local/share/mdt/marte_schema.cue`
- **Project Schema**: If a file named `.marte_schema.cue` exists in the project root, it must be loaded.
- **Merging**: The final schema is a merge of the built-in schema, the system default schema (if found), and the project-specific schema. Rules in later sources (Project > System > Built-in) append to or override earlier ones.
- **Duplicate Fields**:
- **Constraint**: A field must not be defined more than once within the same object/node scope, even if those definitions are spread across different files.
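The loading order above can be sketched as a candidate list, merged first to last so later files override earlier ones (hypothetical `schemaCandidates` helper; existence checks and CUE unification are omitted):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// schemaCandidates returns schema file paths in merge order:
// system default, user default, then the project schema, so
// later entries append to or override earlier ones.
func schemaCandidates(home, projectRoot string) []string {
	return []string{
		"/usr/share/mdt/marte_schema.cue",
		filepath.Join(home, ".local/share/mdt/marte_schema.cue"),
		filepath.Join(projectRoot, ".marte_schema.cue"),
	}
}

func main() {
	home, _ := os.UserHomeDir()
	for _, p := range schemaCandidates(home, ".") {
		fmt.Println(p) // load each that exists, in this order
	}
}
```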
@@ -183,18 +197,20 @@ The `fmt` command must format the code according to the following rules:
The LSP and `check` command should report the following:
- **Warnings**:
- **Unused GAM**: A GAM is defined but not referenced in any thread or scheduler. (Suppress with `//!unused`)
- **Unused Signal**: A signal is explicitly defined in a `DataSource` but never referenced in any `GAM`. (Suppress with `//!unused`)
- **Implicitly Defined Signal**: A signal is defined only within a `GAM` and not in its parent `DataSource`. (Suppress with `//!implicit`)
- **Errors**:
- **Type Inconsistency**: A signal is referenced with a type different from its definition. (Suppress with `//!cast`)
- **Size Inconsistency**: A signal is referenced with a size (dimensions/elements) different from its definition.
- **Invalid Signal Content**: The `Signals` container of a `DataSource` contains invalid elements (e.g., fields instead of nodes).
- **Duplicate Field Definition**: A field is defined multiple times within the same node scope (including across multiple files).
- **Validation Errors**:
- Missing mandatory fields.
- Field type mismatches.
- Grammar errors (e.g., missing closing brackets).
- **Invalid Function Reference**: Elements in the `Functions` array of a `State.Thread` must be valid references to defined GAM nodes.
## Logging


@@ -5,7 +5,7 @@ import (
"strings" "strings"
"testing" "testing"
"github.com/marte-dev/marte-dev-tools/internal/builder" "github.com/marte-community/marte-dev-tools/internal/builder"
) )
func TestMultiFileBuildMergeAndOrder(t *testing.T) { func TestMultiFileBuildMergeAndOrder(t *testing.T) {


@@ -7,11 +7,11 @@ import (
"strings" "strings"
"testing" "testing"
"github.com/marte-dev/marte-dev-tools/internal/builder" "github.com/marte-community/marte-dev-tools/internal/builder"
"github.com/marte-dev/marte-dev-tools/internal/formatter" "github.com/marte-community/marte-dev-tools/internal/formatter"
"github.com/marte-dev/marte-dev-tools/internal/index" "github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-dev/marte-dev-tools/internal/parser" "github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-dev/marte-dev-tools/internal/validator" "github.com/marte-community/marte-dev-tools/internal/validator"
) )
func TestCheckCommand(t *testing.T) { func TestCheckCommand(t *testing.T) {

test/lsp_completion_test.go Normal file

@@ -0,0 +1,320 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/lsp"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/schema"
)
func TestHandleCompletion(t *testing.T) {
setup := func() {
lsp.Tree = index.NewProjectTree()
lsp.Documents = make(map[string]string)
lsp.ProjectRoot = "."
lsp.GlobalSchema = schema.NewSchema()
}
uri := "file://test.marte"
path := "test.marte"
t.Run("Suggest Classes", func(t *testing.T) {
setup()
content := "+Obj = { Class = "
lsp.Documents[uri] = content
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
Position: lsp.Position{Line: 0, Character: len(content)},
}
list := lsp.HandleCompletion(params)
if list == nil || len(list.Items) == 0 {
t.Fatal("Expected class suggestions, got none")
}
found := false
for _, item := range list.Items {
if item.Label == "RealTimeApplication" {
found = true
break
}
}
if !found {
t.Error("Expected RealTimeApplication in class suggestions")
}
})
t.Run("Suggest Fields", func(t *testing.T) {
setup()
content := `
+MyApp = {
Class = RealTimeApplication
}
`
lsp.Documents[uri] = content
p := parser.NewParser(content)
cfg, _ := p.Parse()
lsp.Tree.AddFile(path, cfg)
// Position at line 3 (empty line inside MyApp)
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
Position: lsp.Position{Line: 3, Character: 4},
}
list := lsp.HandleCompletion(params)
if list == nil || len(list.Items) == 0 {
t.Fatal("Expected field suggestions, got none")
}
foundData := false
for _, item := range list.Items {
if item.Label == "Data" {
foundData = true
if item.Detail != "Mandatory" {
t.Errorf("Expected Data to be Mandatory, got %s", item.Detail)
}
}
}
if !foundData {
t.Error("Expected 'Data' in field suggestions for RealTimeApplication")
}
})
t.Run("Suggest References (DataSource)", func(t *testing.T) {
setup()
content := `
$App = {
$Data = {
+InDS = {
Class = FileReader
+Signals = {
Sig1 = { Type = uint32 }
}
}
}
}
+MyGAM = {
Class = IOGAM
+InputSignals = {
S1 = { DataSource = }
}
}
`
lsp.Documents[uri] = content
p := parser.NewParser(content)
cfg, _ := p.Parse()
lsp.Tree.AddFile(path, cfg)
lsp.Tree.ResolveReferences()
// Position at end of "DataSource = "
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
Position: lsp.Position{Line: 14, Character: 28},
}
list := lsp.HandleCompletion(params)
if list == nil || len(list.Items) == 0 {
t.Fatal("Expected DataSource suggestions, got none")
}
foundDS := false
for _, item := range list.Items {
if item.Label == "InDS" {
foundDS = true
break
}
}
if !foundDS {
t.Error("Expected 'InDS' in suggestions for DataSource field")
}
})
t.Run("Filter Existing Fields", func(t *testing.T) {
setup()
content := `
+MyThread = {
Class = RealTimeThread
Functions = { }
}
`
lsp.Documents[uri] = content
p := parser.NewParser(content)
cfg, _ := p.Parse()
lsp.Tree.AddFile(path, cfg)
// Position at line 4
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
Position: lsp.Position{Line: 4, Character: 4},
}
list := lsp.HandleCompletion(params)
for _, item := range list.Items {
if item.Label == "Functions" || item.Label == "Class" {
t.Errorf("Did not expect already defined field %s in suggestions", item.Label)
}
}
})
t.Run("Scope-aware suggestions", func(t *testing.T) {
setup()
// Define a project DataSource in one file
cfg1, _ := parser.NewParser("#package MYPROJ.Data\n+ProjectDS = { Class = FileReader +Signals = { S1 = { Type = int32 } } }").Parse()
lsp.Tree.AddFile("project_ds.marte", cfg1)
// Define an isolated file
contentIso := "+MyGAM = { Class = IOGAM +InputSignals = { S1 = { DataSource = } } }"
lsp.Documents["file://iso.marte"] = contentIso
cfg2, _ := parser.NewParser(contentIso).Parse()
lsp.Tree.AddFile("iso.marte", cfg2)
lsp.Tree.ResolveReferences()
// Completion in isolated file
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: "file://iso.marte"},
Position: lsp.Position{Line: 0, Character: strings.Index(contentIso, "DataSource = ") + len("DataSource = ") + 1},
}
list := lsp.HandleCompletion(params)
foundProjectDS := false
if list != nil {
for _, item := range list.Items {
if item.Label == "ProjectDS" {
foundProjectDS = true
break
}
}
}
if foundProjectDS {
t.Error("Did not expect ProjectDS in isolated file suggestions")
}
// Completion in a project file
lineContent := "+MyGAM = { Class = IOGAM +InputSignals = { S1 = { DataSource = Dummy } } }"
contentPrj := "#package MYPROJ.App\n" + lineContent
lsp.Documents["file://prj.marte"] = contentPrj
pPrj := parser.NewParser(contentPrj)
cfg3, err := pPrj.Parse()
if err != nil {
t.Logf("Parser error in contentPrj: %v", err)
}
lsp.Tree.AddFile("prj.marte", cfg3)
lsp.Tree.ResolveReferences()
paramsPrj := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: "file://prj.marte"},
Position: lsp.Position{Line: 1, Character: strings.Index(lineContent, "Dummy")},
}
listPrj := lsp.HandleCompletion(paramsPrj)
foundProjectDS = false
if listPrj != nil {
for _, item := range listPrj.Items {
if item.Label == "ProjectDS" {
foundProjectDS = true
break
}
}
}
if !foundProjectDS {
t.Error("Expected ProjectDS in project file suggestions")
}
})
t.Run("Suggest Signal Types", func(t *testing.T) {
setup()
content := `
+DS = {
Class = FileReader
Signals = {
S1 = { Type = }
}
}
`
lsp.Documents[uri] = content
p := parser.NewParser(content)
cfg, _ := p.Parse()
lsp.Tree.AddFile(path, cfg)
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
Position: lsp.Position{Line: 4, Character: strings.Index(content, "Type = ") + len("Type = ") + 1},
}
list := lsp.HandleCompletion(params)
if list == nil {
t.Fatal("Expected signal type suggestions")
}
foundUint32 := false
for _, item := range list.Items {
if item.Label == "uint32" {
foundUint32 = true
break
}
}
if !foundUint32 {
t.Error("Expected uint32 in suggestions")
}
})
t.Run("Suggest CUE Enums", func(t *testing.T) {
setup()
// Inject custom schema with enum
custom := []byte(`
package schema
#Classes: {
TestEnumClass: {
Mode: "Auto" | "Manual"
}
}
`)
val := lsp.GlobalSchema.Context.CompileBytes(custom)
lsp.GlobalSchema.Value = lsp.GlobalSchema.Value.Unify(val)
content := `
+Obj = {
Class = TestEnumClass
Mode =
}
`
lsp.Documents[uri] = content
p := parser.NewParser(content)
cfg, _ := p.Parse()
lsp.Tree.AddFile(path, cfg)
params := lsp.CompletionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
Position: lsp.Position{Line: 3, Character: strings.Index(content, "Mode = ") + len("Mode = ") + 1},
}
list := lsp.HandleCompletion(params)
if list == nil {
t.Fatal("Expected enum suggestions")
}
foundAuto := false
for _, item := range list.Items {
if item.Label == "\"Auto\"" { // CUE string value includes quotes
foundAuto = true
break
}
}
if !foundAuto {
// Check if it returned without quotes?
// v.String() returns quoted for string.
t.Error("Expected \"Auto\" in suggestions")
for _, item := range list.Items {
t.Logf("Suggestion: %s", item.Label)
}
}
})
}


@@ -3,8 +3,8 @@ package integration
import (
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
)
func TestLSPHoverDoc(t *testing.T) {


@@ -0,0 +1,73 @@
package integration
import (
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
)
func TestGetNodeContaining(t *testing.T) {
content := `
+App = {
Class = RealTimeApplication
+State1 = {
Class = RealTimeState
+Thread1 = {
Class = RealTimeThread
Functions = { GAM1 }
}
}
}
+GAM1 = { Class = IOGAM }
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
file := "hover_context.marte"
idx.AddFile(file, config)
idx.ResolveReferences()
// Find reference to GAM1
var gamRef *index.Reference
for i := range idx.References {
ref := &idx.References[i]
if ref.Name == "GAM1" {
gamRef = ref
break
}
}
if gamRef == nil {
t.Fatal("Reference to GAM1 not found")
}
// Check containing node
container := idx.GetNodeContaining(file, gamRef.Position)
if container == nil {
t.Fatal("Container not found")
}
if container.RealName != "+Thread1" {
t.Errorf("Expected container +Thread1, got %s", container.RealName)
}
// Check traversal up to State
curr := container
foundState := false
for curr != nil {
if curr.RealName == "+State1" {
foundState = true
break
}
curr = curr.Parent
}
if !foundState {
t.Error("State parent not found")
}
}

test/lsp_server_test.go (new file, 199 lines)

@@ -0,0 +1,199 @@
package integration
import (
"encoding/json"
"os"
"path/filepath"
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/lsp"
"github.com/marte-community/marte-dev-tools/internal/parser"
)
func TestInitProjectScan(t *testing.T) {
// 1. Setup temp dir with files
tmpDir, err := os.MkdirTemp("", "lsp_test")
if err != nil {
t.Fatal(err)
}
defer os.RemoveAll(tmpDir)
// File 1: Definition
if err := os.WriteFile(filepath.Join(tmpDir, "def.marte"), []byte("#package Test.Common\n+Target = { Class = C }"), 0644); err != nil {
t.Fatal(err)
}
// File 2: Reference
if err := os.WriteFile(filepath.Join(tmpDir, "ref.marte"), []byte("#package Test.Common\n+Source = { Class = C Link = Target }"), 0644); err != nil {
t.Fatal(err)
}
// 2. Initialize
lsp.Tree = index.NewProjectTree() // Reset global tree
initParams := lsp.InitializeParams{RootPath: tmpDir}
paramsBytes, _ := json.Marshal(initParams)
msg := &lsp.JsonRpcMessage{
Method: "initialize",
Params: paramsBytes,
ID: 1,
}
lsp.HandleMessage(msg)
// Query the reference in ref.marte at "Target"
defParams := lsp.DefinitionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + filepath.Join(tmpDir, "ref.marte")},
Position: lsp.Position{Line: 1, Character: 29},
}
res := lsp.HandleDefinition(defParams)
if res == nil {
t.Fatal("Definition not found via LSP after initialization")
}
locs, ok := res.([]lsp.Location)
if !ok {
t.Fatalf("Expected []lsp.Location, got %T", res)
}
if len(locs) == 0 {
t.Fatal("No locations found")
}
// Verify uri points to def.marte
expectedURI := "file://" + filepath.Join(tmpDir, "def.marte")
if locs[0].URI != expectedURI {
t.Errorf("Expected URI %s, got %s", expectedURI, locs[0].URI)
}
}
func TestHandleDefinition(t *testing.T) {
// Reset tree for test
lsp.Tree = index.NewProjectTree()
content := `
+MyObject = {
Class = Type
}
+RefObject = {
Class = Type
RefField = MyObject
}
`
path := "/test.marte"
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
lsp.Tree.AddFile(path, config)
lsp.Tree.ResolveReferences()
t.Logf("Refs: %d", len(lsp.Tree.References))
for _, r := range lsp.Tree.References {
t.Logf(" %s at %d:%d", r.Name, r.Position.Line, r.Position.Column)
}
// Test Go to Definition on MyObject reference
params := lsp.DefinitionParams{
TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + path},
Position: lsp.Position{Line: 6, Character: 15}, // "MyObject" in RefField = MyObject
}
result := lsp.HandleDefinition(params)
if result == nil {
t.Fatal("HandleDefinition returned nil")
}
locations, ok := result.([]lsp.Location)
if !ok {
t.Fatalf("Expected []lsp.Location, got %T", result)
}
if len(locations) != 1 {
t.Fatalf("Expected 1 location, got %d", len(locations))
}
if locations[0].Range.Start.Line != 1 { // +MyObject is on line 2 (0-indexed 1)
t.Errorf("Expected definition on line 1, got %d", locations[0].Range.Start.Line)
}
}
func TestHandleReferences(t *testing.T) {
// Reset tree for test
lsp.Tree = index.NewProjectTree()
content := `
+MyObject = {
Class = Type
}
+RefObject = {
Class = Type
RefField = MyObject
}
+AnotherRef = {
Ref = MyObject
}
`
path := "/test.marte"
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
lsp.Tree.AddFile(path, config)
lsp.Tree.ResolveReferences()
// Test Find References for MyObject (triggered from its definition)
params := lsp.ReferenceParams{
TextDocument: lsp.TextDocumentIdentifier{URI: "file://" + path},
Position: lsp.Position{Line: 1, Character: 1}, // "+MyObject"
Context: lsp.ReferenceContext{IncludeDeclaration: true},
}
locations := lsp.HandleReferences(params)
if len(locations) != 3 { // 1 declaration + 2 references
t.Fatalf("Expected 3 locations, got %d", len(locations))
}
}
func TestLSPFormatting(t *testing.T) {
// Setup
content := `
#package Proj.Main
+Object={
Field=1
}
`
uri := "file:///test.marte"
// Open (populate Documents map)
lsp.Documents[uri] = content
// Format
params := lsp.DocumentFormattingParams{
TextDocument: lsp.TextDocumentIdentifier{URI: uri},
}
edits := lsp.HandleFormatting(params)
if len(edits) != 1 {
t.Fatalf("Expected 1 edit, got %d", len(edits))
}
newText := edits[0].NewText
expected := `#package Proj.Main
+Object = {
Field = 1
}
`
// Normalize newlines for comparison just in case
if strings.TrimSpace(strings.ReplaceAll(newText, "\r\n", "\n")) != strings.TrimSpace(strings.ReplaceAll(expected, "\r\n", "\n")) {
t.Errorf("Formatting mismatch.\nExpected:\n%s\nGot:\n%s", expected, newText)
}
}


@@ -3,18 +3,32 @@ package integration
 import (
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
-func TestLSPSignalMetadata(t *testing.T) {
+func TestLSPSignalReferences(t *testing.T) {
 content := `
-+MySignal = {
-Class = Signal
-Type = uint32
-NumberOfElements = 10
-NumberOfDimensions = 1
-DataSource = DDB1
++Data = {
+Class = ReferenceContainer
++MyDS = {
+Class = FileReader
+Filename = "test"
+Signals = {
+MySig = { Type = uint32 }
+}
+}
+}
++MyGAM = {
+Class = IOGAM
+InputSignals = {
+MySig = {
+DataSource = MyDS
+Type = uint32
+}
+}
 }
 `
 p := parser.NewParser(content)
@@ -24,26 +38,56 @@ func TestLSPSignalMetadata(t *testing.T) {
 }
 idx := index.NewProjectTree()
-file := "signal.marte"
-idx.AddFile(file, config)
-res := idx.Query(file, 2, 2) // Query +MySignal
-if res == nil || res.Node == nil {
-t.Fatal("Query failed for signal definition")
-}
-meta := res.Node.Metadata
-if meta["Class"] != "Signal" {
-t.Errorf("Expected Class Signal, got %s", meta["Class"])
-}
-if meta["Type"] != "uint32" {
-t.Errorf("Expected Type uint32, got %s", meta["Type"])
-}
-if meta["NumberOfElements"] != "10" {
-t.Errorf("Expected 10 elements, got %s", meta["NumberOfElements"])
-}
-// Since handleHover logic is in internal/lsp which we can't easily test directly without
-// exposing formatNodeInfo, we rely on the fact that Metadata is populated correctly.
-// If Metadata is correct, server.go logic (verified by code review) should display it.
+idx.AddFile("signal_refs.marte", config)
+idx.ResolveReferences()
+v := validator.NewValidator(idx, ".")
+v.ValidateProject()
+// Find definition of MySig in MyDS
+root := idx.IsolatedFiles["signal_refs.marte"]
+if root == nil {
+t.Fatal("Root node not found")
+}
+// Traverse to MySig
+dataNode := root.Children["Data"]
+if dataNode == nil {
+t.Fatal("Data node not found")
+}
+myDS := dataNode.Children["MyDS"]
+if myDS == nil {
+t.Fatal("MyDS node not found")
+}
+signals := myDS.Children["Signals"]
+if signals == nil {
+t.Fatal("Signals node not found")
+}
+mySigDef := signals.Children["MySig"]
+if mySigDef == nil {
+t.Fatal("Definition of MySig not found in tree")
+}
+// Now simulate "Find References" on mySigDef
+foundRefs := 0
+idx.Walk(func(node *index.ProjectNode) {
+if node.Target == mySigDef {
+foundRefs++
+// Check if node is the GAM signal
+if node.RealName != "MySig" { // In GAM it is MySig
+t.Errorf("Unexpected reference node name: %s", node.RealName)
+}
+// Check parent is InputSignals -> MyGAM
+if node.Parent == nil || node.Parent.Parent == nil || node.Parent.Parent.RealName != "+MyGAM" {
+t.Errorf("Reference node not in MyGAM")
+}
+}
+})
+if foundRefs != 1 {
+t.Errorf("Expected 1 reference (Direct), found %d", foundRefs)
+}
 }


@@ -4,9 +4,9 @@ import (
 "io/ioutil"
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
-"github.com/marte-dev/marte-dev-tools/internal/validator"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 // Helper to load and parse a file
@@ -140,3 +140,16 @@ func TestLSPHover(t *testing.T) {
 t.Errorf("Expected +MyObject, got %s", res.Node.RealName)
 }
 }
+func TestParserError(t *testing.T) {
+invalidContent := `
+A = {
+Field =
+}
+`
+p := parser.NewParser(invalidContent)
+_, err := p.Parse()
+if err == nil {
+t.Fatal("Expected parser error, got nil")
+}
+}


@@ -0,0 +1,35 @@
package integration
import (
"testing"
"github.com/marte-community/marte-dev-tools/internal/parser"
)
func TestParserStrictness(t *testing.T) {
// Case 1: content not a definition (missing =)
invalidDef := `
A = {
Field = 10
XXX
}
`
p := parser.NewParser(invalidDef)
_, err := p.Parse()
if err == nil {
t.Error("Expected error for invalid definition XXX, got nil")
}
// Case 2: Missing closing bracket
missingBrace := `
A = {
SUBNODE = {
FIELD = 10
}
`
p2 := parser.NewParser(missingBrace)
_, err2 := p2.Parse()
if err2 == nil {
t.Error("Expected error for missing closing bracket, got nil")
}
}


@@ -1,7 +1,9 @@
-package parser
+package integration
 import (
 "testing"
+"github.com/marte-community/marte-dev-tools/internal/parser"
 )
 func TestParseBasic(t *testing.T) {
@@ -22,7 +24,7 @@ $Node2 = {
 Array = {1 2 3}
 }
 `
-p := NewParser(input)
+p := parser.NewParser(input)
 config, err := p.Parse()
 if err != nil {
 t.Fatalf("Parse error: %v", err)


@@ -4,9 +4,9 @@ import (
 "strings"
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
-"github.com/marte-dev/marte-dev-tools/internal/validator"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func TestMDSWriterValidation(t *testing.T) {
@@ -38,7 +38,7 @@ func TestMDSWriterValidation(t *testing.T) {
 found := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'TreeName'") {
+if strings.Contains(d.Message, "TreeName: incomplete value") {
 found = true
 break
 }
@@ -71,7 +71,7 @@ func TestMathExpressionGAMValidation(t *testing.T) {
 found := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'Expression'") {
+if strings.Contains(d.Message, "Expression: incomplete value") {
 found = true
 break
 }


@@ -4,9 +4,9 @@ import (
 "strings"
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
-"github.com/marte-dev/marte-dev-tools/internal/validator"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func TestPIDGAMValidation(t *testing.T) {
@@ -35,10 +35,10 @@ func TestPIDGAMValidation(t *testing.T) {
 foundKd := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'Ki'") {
+if strings.Contains(d.Message, "Ki: incomplete value") {
 foundKi = true
 }
-if strings.Contains(d.Message, "Missing mandatory field 'Kd'") {
+if strings.Contains(d.Message, "Kd: incomplete value") {
 foundKd = true
 }
 }
@@ -73,7 +73,7 @@ func TestFileDataSourceValidation(t *testing.T) {
 found := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'Filename'") {
+if strings.Contains(d.Message, "Filename: incomplete value") {
 found = true
 break
 }


@@ -4,9 +4,9 @@ import (
 "strings"
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
-"github.com/marte-dev/marte-dev-tools/internal/validator"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func TestRealTimeApplicationValidation(t *testing.T) {
@@ -35,14 +35,20 @@ func TestRealTimeApplicationValidation(t *testing.T) {
 missingStates := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'Data'") {
+if strings.Contains(d.Message, "Data: field is required") {
 missingData = true
 }
-if strings.Contains(d.Message, "Missing mandatory field 'States'") {
+if strings.Contains(d.Message, "States: field is required") {
 missingStates = true
 }
 }
+if !missingData || !missingStates {
+for _, d := range v.Diagnostics {
+t.Logf("Diagnostic: %s", d.Message)
+}
+}
 if !missingData {
 t.Error("Expected error for missing 'Data' field in RealTimeApplication")
 }
@@ -73,7 +79,7 @@ func TestGAMSchedulerValidation(t *testing.T) {
 found := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'TimingDataSource'") {
+if strings.Contains(d.Message, "TimingDataSource: incomplete value") {
 found = true
 break
 }


@@ -4,9 +4,9 @@ import (
 "strings"
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
-"github.com/marte-dev/marte-dev-tools/internal/validator"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func TestSDNSubscriberValidation(t *testing.T) {
@@ -32,7 +32,7 @@ func TestSDNSubscriberValidation(t *testing.T) {
 found := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'Port'") {
+if strings.Contains(d.Message, "Port: incomplete value") {
 found = true
 break
 }
@@ -65,7 +65,7 @@ func TestFileWriterValidation(t *testing.T) {
 found := false
 for _, d := range v.Diagnostics {
-if strings.Contains(d.Message, "Missing mandatory field 'Filename'") {
+if strings.Contains(d.Message, "Filename: incomplete value") {
 found = true
 break
 }


@@ -0,0 +1,74 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestFunctionsArrayValidation(t *testing.T) {
content := `
+App = {
Class = RealTimeApplication
+State = {
Class = RealTimeState
+Thread = {
Class = RealTimeThread
Functions = {
ValidGAM,
InvalidGAM, // Not a GAM (DataSource)
MissingGAM, // Not found
"String", // Not reference
}
}
}
}
+ValidGAM = { Class = IOGAM InputSignals = {} }
+InvalidGAM = { Class = FileReader }
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("funcs.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
foundInvalid := false
foundMissing := false
foundNotRef := false
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "not found or is not a valid GAM") {
// This covers both InvalidGAM and MissingGAM cases
if strings.Contains(d.Message, "InvalidGAM") {
foundInvalid = true
}
if strings.Contains(d.Message, "MissingGAM") {
foundMissing = true
}
}
if strings.Contains(d.Message, "must contain references") {
foundNotRef = true
}
}
if !foundInvalid {
t.Error("Expected error for InvalidGAM")
}
if !foundMissing {
t.Error("Expected error for MissingGAM")
}
if !foundNotRef {
t.Error("Expected error for non-reference element")
}
}


@@ -0,0 +1,85 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestGAMSignalDirectionality(t *testing.T) {
content := `
$App = {
$Data = {
+InDS = { Class = FileReader Filename="f" +Signals = { S1 = { Type = uint32 } } }
+OutDS = { Class = FileWriter Filename="f" +Signals = { S1 = { Type = uint32 } } }
+InOutDS = { Class = FileDataSource Filename="f" +Signals = { S1 = { Type = uint32 } } }
}
+ValidGAM = {
Class = IOGAM
InputSignals = {
S1 = { DataSource = InDS }
S2 = { DataSource = InOutDS Alias = S1 }
}
OutputSignals = {
S3 = { DataSource = OutDS Alias = S1 }
S4 = { DataSource = InOutDS Alias = S1 }
}
}
+InvalidGAM = {
Class = IOGAM
InputSignals = {
BadIn = { DataSource = OutDS Alias = S1 }
}
OutputSignals = {
BadOut = { DataSource = InDS Alias = S1 }
}
}
}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("dir.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
// Check ValidGAM has NO directionality errors
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "is Output-only but referenced in InputSignals") ||
strings.Contains(d.Message, "is Input-only but referenced in OutputSignals") {
if strings.Contains(d.Message, "ValidGAM") {
t.Errorf("Unexpected direction error for ValidGAM: %s", d.Message)
}
}
}
// Check InvalidGAM HAS errors
foundBadIn := false
foundBadOut := false
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "InvalidGAM") {
if strings.Contains(d.Message, "is Output-only but referenced in InputSignals") {
foundBadIn = true
}
if strings.Contains(d.Message, "is Input-only but referenced in OutputSignals") {
foundBadOut = true
}
}
}
if !foundBadIn {
t.Error("Expected error for OutDS in InputSignals of InvalidGAM")
}
if !foundBadOut {
t.Error("Expected error for InDS in OutputSignals of InvalidGAM")
}
}


@@ -0,0 +1,81 @@
package integration
import (
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestGAMSignalLinking(t *testing.T) {
content := `
+Data = {
Class = ReferenceContainer
+MyDS = {
Class = FileReader
Filename = "test.txt"
Signals = {
MySig = { Type = uint32 }
}
}
}
+MyGAM = {
Class = IOGAM
InputSignals = {
MySig = {
DataSource = MyDS
Type = uint32
}
AliasedSig = {
Alias = MySig
DataSource = MyDS
Type = uint32
}
}
}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("gam_signals_linking.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
if len(v.Diagnostics) > 0 {
for _, d := range v.Diagnostics {
t.Logf("Diagnostic: %s", d.Message)
}
t.Fatalf("Validation failed with %d issues", len(v.Diagnostics))
}
foundMyDSRef := 0
foundAliasRef := 0
for _, ref := range idx.References {
if ref.Name == "MyDS" {
if ref.Target != nil && ref.Target.RealName == "+MyDS" {
foundMyDSRef++
}
}
if ref.Name == "MySig" {
if ref.Target != nil && ref.Target.RealName == "MySig" {
foundAliasRef++
}
}
}
if foundMyDSRef < 2 {
t.Errorf("Expected at least 2 resolved MyDS references, found %d", foundMyDSRef)
}
if foundAliasRef < 1 {
t.Errorf("Expected at least 1 resolved Alias MySig reference, found %d", foundAliasRef)
}
}


@@ -0,0 +1,108 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestGAMSignalValidation(t *testing.T) {
content := `
+Data = {
Class = ReferenceContainer
+InDS = {
Class = FileReader
Signals = {
SigIn = { Type = uint32 }
}
}
+OutDS = {
Class = FileWriter
Signals = {
SigOut = { Type = uint32 }
}
}
}
+MyGAM = {
Class = IOGAM
InputSignals = {
SigIn = {
DataSource = InDS
Type = uint32
}
// Error: OutDS is OUT only
BadInput = {
DataSource = OutDS
Alias = SigOut
Type = uint32
}
// Error: MissingSig not in InDS
Missing = {
DataSource = InDS
Alias = MissingSig
Type = uint32
}
}
OutputSignals = {
SigOut = {
DataSource = OutDS
Type = uint32
}
// Error: InDS is IN only
BadOutput = {
DataSource = InDS
Alias = SigIn
Type = uint32
}
}
}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("gam_signals.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
foundBadInput := false
foundMissing := false
foundBadOutput := false
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "DataSource 'OutDS' (Class FileWriter) is Output-only but referenced in InputSignals") {
foundBadInput = true
}
if strings.Contains(d.Message, "Implicitly Defined Signal: 'MissingSig'") {
foundMissing = true
}
if strings.Contains(d.Message, "DataSource 'InDS' (Class FileReader) is Input-only but referenced in OutputSignals") {
foundBadOutput = true
}
}
if !foundBadInput || !foundMissing || !foundBadOutput {
for _, d := range v.Diagnostics {
t.Logf("Diagnostic: %s", d.Message)
}
}
if !foundBadInput {
t.Error("Expected error for OutDS in InputSignals")
}
if !foundMissing {
t.Error("Expected error for missing signal reference")
}
if !foundBadOutput {
t.Error("Expected error for InDS in OutputSignals")
}
}


@@ -0,0 +1,65 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestGlobalPragmaDebug(t *testing.T) {
content := `//! allow(implicit): Debugging
//! allow(unused): Debugging
+Data={Class=ReferenceContainer}
+GAM={Class=IOGAM InputSignals={Impl={DataSource=Data Type=uint32}}}
+UnusedGAM={Class=IOGAM}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
// Check if pragma parsed
if len(config.Pragmas) == 0 {
t.Fatal("Pragma not parsed")
}
t.Logf("Parsed Pragma 0: %s", config.Pragmas[0].Text)
idx := index.NewProjectTree()
idx.AddFile("debug.marte", config)
idx.ResolveReferences()
// Check if added to GlobalPragmas
pragmas, ok := idx.GlobalPragmas["debug.marte"]
if !ok || len(pragmas) == 0 {
t.Fatal("GlobalPragmas not populated")
}
t.Logf("Global Pragma stored: %s", pragmas[0])
v := validator.NewValidator(idx, ".")
v.ValidateProject()
v.CheckUnused() // Must call this for unused check!
foundImplicitWarning := false
foundUnusedWarning := false
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "Implicitly Defined Signal") {
foundImplicitWarning = true
t.Logf("Found warning: %s", d.Message)
}
if strings.Contains(d.Message, "Unused GAM") {
foundUnusedWarning = true
t.Logf("Found warning: %s", d.Message)
}
}
if foundImplicitWarning {
t.Error("Expected implicit warning to be suppressed")
}
if foundUnusedWarning {
t.Error("Expected unused warning to be suppressed")
}
}


@@ -0,0 +1,67 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestGlobalPragma(t *testing.T) {
content := `
//!allow(unused): Suppress all unused
//!allow(implicit): Suppress all implicit
+Data = {
Class = ReferenceContainer
+MyDS = {
Class = FileReader
Filename = "test"
Signals = {
UnusedSig = { Type = uint32 }
}
}
}
+MyGAM = {
Class = IOGAM
InputSignals = {
ImplicitSig = { DataSource = MyDS Type = uint32 }
}
}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("global_pragma.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
v.CheckUnused()
foundUnusedWarning := false
foundImplicitWarning := false
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "Unused Signal") {
foundUnusedWarning = true
}
if strings.Contains(d.Message, "Implicitly Defined Signal") {
foundImplicitWarning = true
}
}
if foundUnusedWarning {
t.Error("Expected warning for UnusedSig to be suppressed globally")
}
if foundImplicitWarning {
t.Error("Expected warning for ImplicitSig to be suppressed globally")
}
}


@@ -0,0 +1,75 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestGlobalPragmaUpdate(t *testing.T) {
// Scenario: Project scope. File A has pragma. File B has warning.
fileA := "fileA.marte"
contentA_WithPragma := `
#package my.project
//!allow(unused): Suppress
`
contentA_NoPragma := `
#package my.project
// No pragma
`
fileB := "fileB.marte"
contentB := `
#package my.project
+Data={Class=ReferenceContainer +DS={Class=FileReader Filename="t" Signals={Unused={Type=uint32}}}}
`
idx := index.NewProjectTree()
// Helper to validate
check := func() bool {
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
v.CheckUnused()
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "Unused Signal") {
return true // Found warning
}
}
return false
}
// 1. Add A (with pragma) and B
pA := parser.NewParser(contentA_WithPragma)
cA, _ := pA.Parse()
idx.AddFile(fileA, cA)
pB := parser.NewParser(contentB)
cB, _ := pB.Parse()
idx.AddFile(fileB, cB)
if check() {
t.Error("Step 1: Expected warning to be suppressed")
}
// 2. Update A (remove pragma)
pA2 := parser.NewParser(contentA_NoPragma)
cA2, _ := pA2.Parse()
idx.AddFile(fileA, cA2)
if !check() {
t.Error("Step 2: Expected warning to appear")
}
// 3. Update A (add pragma back)
idx.AddFile(fileA, cA) // Re-use config A
if check() {
t.Error("Step 3: Expected warning to be suppressed again")
}
}


@@ -0,0 +1,59 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestIgnorePragma(t *testing.T) {
content := `
//!ignore(unused): Suppress global unused
+Data = {
Class = ReferenceContainer
+MyDS = {
Class = FileReader
Filename = "test"
Signals = {
Unused1 = { Type = uint32 }
//!ignore(unused): Suppress local unused
Unused2 = { Type = uint32 }
}
}
}
+MyGAM = {
Class = IOGAM
InputSignals = {
//!ignore(implicit): Suppress local implicit
ImplicitSig = { DataSource = MyDS Type = uint32 }
}
}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("ignore.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
v.CheckUnused()
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "Unused Signal") {
t.Errorf("Unexpected warning: %s", d.Message)
}
if strings.Contains(d.Message, "Implicitly Defined Signal") {
t.Errorf("Unexpected warning: %s", d.Message)
}
}
}


@@ -0,0 +1,107 @@
package integration
import (
"strings"
"testing"
"github.com/marte-community/marte-dev-tools/internal/index"
"github.com/marte-community/marte-dev-tools/internal/parser"
"github.com/marte-community/marte-dev-tools/internal/validator"
)
func TestImplicitSignal(t *testing.T) {
content := `
+Data = {
Class = ReferenceContainer
+MyDS = {
Class = FileReader
Filename = "test"
Signals = {
ExplicitSig = { Type = uint32 }
}
}
}
+MyGAM = {
Class = IOGAM
InputSignals = {
ExplicitSig = {
DataSource = MyDS
Type = uint32
}
ImplicitSig = {
DataSource = MyDS
Type = uint32
}
}
}
`
p := parser.NewParser(content)
config, err := p.Parse()
if err != nil {
t.Fatalf("Parse failed: %v", err)
}
idx := index.NewProjectTree()
idx.AddFile("implicit_signal.marte", config)
idx.ResolveReferences()
v := validator.NewValidator(idx, ".")
v.ValidateProject()
foundWarning := false
foundError := false
for _, d := range v.Diagnostics {
if strings.Contains(d.Message, "Implicitly Defined Signal") {
if strings.Contains(d.Message, "ImplicitSig") {
foundWarning = true
}
}
if strings.Contains(d.Message, "Signal 'ExplicitSig' not found") {
foundError = true
}
}
if !foundWarning || foundError {
for _, d := range v.Diagnostics {
t.Logf("Diagnostic: %s", d.Message)
}
}
if !foundWarning {
t.Error("Expected warning for ImplicitSig")
}
if foundError {
t.Error("Unexpected error for ExplicitSig")
}
// Test missing Type for implicit
contentMissingType := `
+Data = { Class = ReferenceContainer +DS={Class=FileReader Filename="" Signals={}} }
+GAM = { Class = IOGAM InputSignals = { Impl = { DataSource = DS } } }
`
p2 := parser.NewParser(contentMissingType)
config2, err2 := p2.Parse()
if err2 != nil {
t.Fatalf("Parse2 failed: %v", err2)
}
idx2 := index.NewProjectTree()
idx2.AddFile("missing_type.marte", config2)
idx2.ResolveReferences()
v2 := validator.NewValidator(idx2, ".")
v2.ValidateProject()
foundTypeErr := false
for _, d := range v2.Diagnostics {
if strings.Contains(d.Message, "Implicit signal 'Impl' must define Type") {
foundTypeErr = true
}
}
if !foundTypeErr {
for _, d := range v2.Diagnostics {
t.Logf("Diagnostic2: %s", d.Message)
}
t.Error("Expected error for missing Type in implicit signal")
}
}


@@ -5,9 +5,9 @@ import (
 "strings"
 "testing"
-"github.com/marte-dev/marte-dev-tools/internal/index"
-"github.com/marte-dev/marte-dev-tools/internal/parser"
-"github.com/marte-dev/marte-dev-tools/internal/validator"
+"github.com/marte-community/marte-dev-tools/internal/index"
+"github.com/marte-community/marte-dev-tools/internal/parser"
+"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func parseAndAddToIndex(t *testing.T, idx *index.ProjectTree, filePath string) {


@@ -0,0 +1,69 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestPragmaSuppression(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            //!unused: Ignore this
            UnusedSig = { Type = uint32 }
            UsedSig = { Type = uint32 }
        }
    }
}
+MyGAM = {
    Class = IOGAM
    InputSignals = {
        UsedSig = { DataSource = MyDS Type = uint32 }
        //!implicit: Ignore this implicit
        ImplicitSig = { DataSource = MyDS Type = uint32 }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	idx := index.NewProjectTree()
	idx.AddFile("pragma.marte", config)
	idx.ResolveReferences()
	v := validator.NewValidator(idx, ".")
	v.ValidateProject()
	v.CheckUnused()
	foundUnusedWarning := false
	foundImplicitWarning := false
	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "Unused Signal") && strings.Contains(d.Message, "UnusedSig") {
			foundUnusedWarning = true
		}
		if strings.Contains(d.Message, "Implicitly Defined Signal") && strings.Contains(d.Message, "ImplicitSig") {
			foundImplicitWarning = true
		}
	}
	if foundUnusedWarning {
		t.Error("Expected warning for UnusedSig to be suppressed")
	}
	if foundImplicitWarning {
		t.Error("Expected warning for ImplicitSig to be suppressed")
	}
}


@@ -6,9 +6,9 @@ import (
 	"strings"
 	"testing"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func TestProjectSpecificSchema(t *testing.T) {
@@ -21,17 +21,16 @@ func TestProjectSpecificSchema(t *testing.T) {
 	// Define project schema
 	schemaContent := `
-{
-    "classes": {
-        "ProjectClass": {
-            "fields": [
-                {"name": "CustomField", "type": "int", "mandatory": true}
-            ]
-        }
-    }
-}
+package schema
+
+#Classes: {
+	ProjectClass: {
+		CustomField: int
+		...
+	}
+}
 `
-	err = os.WriteFile(filepath.Join(tmpDir, ".marte_schema.json"), []byte(schemaContent), 0644)
+	err = os.WriteFile(filepath.Join(tmpDir, ".marte_schema.cue"), []byte(schemaContent), 0644)
 	if err != nil {
 		t.Fatal(err)
 	}
@@ -59,7 +58,7 @@ func TestProjectSpecificSchema(t *testing.T) {
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'CustomField'") {
+		if strings.Contains(d.Message, "CustomField: incomplete value") {
 			found = true
 			break
 		}


@@ -4,44 +4,11 @@ import (
 	"strings"
 	"testing"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
-func TestSchemaValidationMandatory(t *testing.T) {
-	// StateMachine requires "States"
-	content := `
-+MySM = {
-    Class = StateMachine
-    // Missing States
-}
-`
-	p := parser.NewParser(content)
-	config, err := p.Parse()
-	if err != nil {
-		t.Fatalf("Parse failed: %v", err)
-	}
-	idx := index.NewProjectTree()
-	idx.AddFile("test.marte", config)
-	v := validator.NewValidator(idx, ".")
-	v.ValidateProject()
-	found := false
-	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Missing mandatory field 'States'") {
-			found = true
-			break
-		}
-	}
-	if !found {
-		t.Error("Expected error for missing mandatory field 'States', but found none")
-	}
-}
 func TestSchemaValidationType(t *testing.T) {
 	// OrderedClass: First (int), Second (string)
 	content := `
@@ -65,7 +32,7 @@ func TestSchemaValidationType(t *testing.T) {
 	found := false
 	for _, d := range v.Diagnostics {
-		if strings.Contains(d.Message, "Field 'First' expects type 'int'") {
+		if strings.Contains(d.Message, "mismatched types") {
 			found = true
 			break
 		}
@@ -105,8 +72,8 @@ func TestSchemaValidationOrder(t *testing.T) {
} }
} }
if !found { if found {
t.Error("Expected error for out-of-order fields, but found none") t.Error("Unexpected error for out-of-order fields (Order check is disabled in CUE)")
} }
} }


@@ -0,0 +1,108 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSignalProperties(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +MyDS = {
        Class = FileReader
        Filename = "test"
        Signals = {
            Correct = { Type = uint32 NumberOfElements = 10 }
        }
    }
}
+MyGAM = {
    Class = IOGAM
    InputSignals = {
        // Correct reference
        Correct = { DataSource = MyDS Type = uint32 NumberOfElements = 10 }
        // Mismatch Type
        BadType = {
            Alias = Correct
            DataSource = MyDS
            Type = float32 // Error
        }
        // Mismatch Elements
        BadElements = {
            Alias = Correct
            DataSource = MyDS
            Type = uint32
            NumberOfElements = 20 // Error
        }
        // Valid Cast
        //!cast(uint32, float32): Cast reason
        CastSig = {
            Alias = Correct
            DataSource = MyDS
            Type = float32 // OK
        }
        // Invalid Cast (Wrong definition type in pragma)
        //!cast(int32, float32): Wrong def type
        BadCast = {
            Alias = Correct
            DataSource = MyDS
            Type = float32 // Error because pragma mismatch
        }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	idx := index.NewProjectTree()
	idx.AddFile("signal_props.marte", config)
	idx.ResolveReferences()
	v := validator.NewValidator(idx, ".")
	v.ValidateProject()
	foundBadType := false
	foundBadElements := false
	foundBadCast := false
	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "property 'Type' mismatch") {
			if strings.Contains(d.Message, "'BadType'") {
				foundBadType = true
			}
			if strings.Contains(d.Message, "'BadCast'") {
				foundBadCast = true
			}
			if strings.Contains(d.Message, "'CastSig'") {
				t.Error("Unexpected error for CastSig (should be suppressed by pragma)")
			}
		}
		if strings.Contains(d.Message, "property 'NumberOfElements' mismatch") {
			foundBadElements = true
		}
	}
	if !foundBadType {
		t.Error("Expected error for BadType")
	}
	if !foundBadElements {
		t.Error("Expected error for BadElements")
	}
	if !foundBadCast {
		t.Error("Expected error for BadCast (pragma mismatch)")
	}
}


@@ -0,0 +1,73 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSignalValidation(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +ValidDS = {
        Class = DataSource
        Signals = {
            ValidSig = {
                Type = uint32
            }
        }
    }
    +MissingTypeDS = {
        Class = DataSource
        Signals = {
            InvalidSig = {
                // Missing Type
                Dummy = 1
            }
        }
    }
    +InvalidTypeDS = {
        Class = DataSource
        Signals = {
            InvalidSig = {
                Type = invalid_type
            }
        }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	idx := index.NewProjectTree()
	idx.AddFile("signal_test.marte", config)
	v := validator.NewValidator(idx, ".")
	v.ValidateProject()
	foundMissing := false
	foundInvalid := false
	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "missing mandatory field 'Type'") {
			foundMissing = true
		}
		if strings.Contains(d.Message, "Invalid Type 'invalid_type'") {
			foundInvalid = true
		}
	}
	if !foundMissing {
		t.Error("Expected error for missing Type field in Signal")
	}
	if !foundInvalid {
		t.Error("Expected error for invalid Type value in Signal")
	}
}


@@ -0,0 +1,59 @@
package integration

import (
	"strings"
	"testing"

	"github.com/marte-community/marte-dev-tools/internal/index"
	"github.com/marte-community/marte-dev-tools/internal/parser"
	"github.com/marte-community/marte-dev-tools/internal/validator"
)

func TestSignalsContentValidation(t *testing.T) {
	content := `
+Data = {
    Class = ReferenceContainer
    +BadDS = {
        Class = DataSource
        Signals = {
            BadField = 1
            BadArray = { 1 2 }
            // Valid signal
            ValidSig = {
                Type = uint32
            }
        }
    }
}
`
	p := parser.NewParser(content)
	config, err := p.Parse()
	if err != nil {
		t.Fatalf("Parse failed: %v", err)
	}
	idx := index.NewProjectTree()
	idx.AddFile("signals_content.marte", config)
	v := validator.NewValidator(idx, ".")
	v.ValidateProject()
	foundBadField := false
	foundBadArray := false
	for _, d := range v.Diagnostics {
		if strings.Contains(d.Message, "Field 'BadField' is not allowed") {
			foundBadField = true
		}
		if strings.Contains(d.Message, "Field 'BadArray' is not allowed") {
			foundBadArray = true
		}
	}
	if !foundBadField {
		t.Error("Expected error for BadField in Signals")
	}
	if !foundBadArray {
		t.Error("Expected error for BadArray in Signals")
	}
}


@@ -3,9 +3,9 @@ package integration
 import (
 	"testing"
-	"github.com/marte-dev/marte-dev-tools/internal/index"
-	"github.com/marte-dev/marte-dev-tools/internal/parser"
-	"github.com/marte-dev/marte-dev-tools/internal/validator"
+	"github.com/marte-community/marte-dev-tools/internal/index"
+	"github.com/marte-community/marte-dev-tools/internal/parser"
+	"github.com/marte-community/marte-dev-tools/internal/validator"
 )
 func TestUnusedGAM(t *testing.T) {
@@ -63,11 +63,13 @@ $App = {
 	$Data = {
 		+MyDS = {
 			Class = DataSourceClass
+			+Signals = {
 				Sig1 = { Type = uint32 }
 				Sig2 = { Type = uint32 }
 			}
 		}
 	}
+}
 	+MyGAM = {
 		Class = GAMClass
 		+InputSignals = {