Implemented operators and better indexing
This commit is contained in:
@@ -34,16 +34,16 @@ Responsible for converting MARTe configuration text into structured data.
|
|||||||
|
|
||||||
* **Lexer (`lexer.go`)**: Tokenizes the input stream. Handles MARTe specific syntax like `#package`, `//!` pragmas, and `//#` docstrings. Supports standard identifiers and `#`-prefixed identifiers.
|
* **Lexer (`lexer.go`)**: Tokenizes the input stream. Handles MARTe specific syntax like `#package`, `//!` pragmas, and `//#` docstrings. Supports standard identifiers and `#`-prefixed identifiers.
|
||||||
* **Parser (`parser.go`)**: Recursive descent parser. Converts tokens into a `Configuration` object containing definitions, comments, and pragmas.
|
* **Parser (`parser.go`)**: Recursive descent parser. Converts tokens into a `Configuration` object containing definitions, comments, and pragmas.
|
||||||
* **AST (`ast.go`)**: Defines the node types (`ObjectNode`, `Field`, `Value`, etc.). All nodes implement the `Node` interface providing position information.
|
* **AST (`ast.go`)**: Defines the node types (`ObjectNode`, `Field`, `Value`, `VariableDefinition`, etc.). All nodes implement the `Node` interface providing position information.
|
||||||
|
|
||||||
### 2. `internal/index`
|
### 2. `internal/index`
|
||||||
|
|
||||||
The brain of the system. It maintains a holistic view of the project.
|
The brain of the system. It maintains a holistic view of the project.
|
||||||
|
|
||||||
* **ProjectTree**: The central data structure. It holds the root of the configuration hierarchy (`Root`), references, and isolated files.
|
* **ProjectTree**: The central data structure. It holds the root of the configuration hierarchy (`Root`), references, and isolated files.
|
||||||
* **ProjectNode**: Represents a logical node in the configuration. Since a node can be defined across multiple files (fragments), `ProjectNode` aggregates these fragments.
|
* **ProjectNode**: Represents a logical node in the configuration. Since a node can be defined across multiple files (fragments), `ProjectNode` aggregates these fragments. It also stores locally defined variables in its `Variables` map.
|
||||||
* **NodeMap**: A hash map index (`map[string][]*ProjectNode`) for $O(1)$ symbol lookups, optimizing `FindNode` operations.
|
* **NodeMap**: A hash map index (`map[string][]*ProjectNode`) for $O(1)$ symbol lookups, optimizing `FindNode` operations.
|
||||||
* **Reference Resolution**: The `ResolveReferences` method links `Reference` objects to their target `ProjectNode` using the `NodeMap`.
|
* **Reference Resolution**: The `ResolveReferences` method links `Reference` objects to their target `ProjectNode` or `VariableDefinition`. It uses `resolveScopedName` to respect lexical scoping rules, searching up the hierarchy from the reference's container.
|
||||||
|
|
||||||
### 3. `internal/validator`
|
### 3. `internal/validator`
|
||||||
|
|
||||||
@@ -54,7 +54,9 @@ Ensures configuration correctness.
|
|||||||
* **Structure**: Duplicate fields, invalid content.
|
* **Structure**: Duplicate fields, invalid content.
|
||||||
* **Schema**: Unifies nodes with CUE schemas (loaded via `internal/schema`) to validate types and mandatory fields.
|
* **Schema**: Unifies nodes with CUE schemas (loaded via `internal/schema`) to validate types and mandatory fields.
|
||||||
* **Signals**: Verifies that signals referenced in GAMs exist in DataSources and match types.
|
* **Signals**: Verifies that signals referenced in GAMs exist in DataSources and match types.
|
||||||
* **Threading**: Checks `checkDataSourceThreading` to ensure non-multithreaded DataSources are not shared across threads in the same state.
|
* **Threading**: Checks `CheckDataSourceThreading` to ensure non-multithreaded DataSources are not shared across threads in the same state.
|
||||||
|
* **Ordering**: `CheckINOUTOrdering` verifies that for `INOUT` signals, the producing GAM appears before the consuming GAM in the thread's execution list.
|
||||||
|
* **Variables**: `CheckVariables` validates variable values against their defined CUE types (e.g. `uint`, regex). `CheckUnresolvedVariables` ensures all used variables are defined.
|
||||||
* **Unused**: Detects unused GAMs and Signals (suppressible via pragmas).
|
* **Unused**: Detects unused GAMs and Signals (suppressible via pragmas).
|
||||||
|
|
||||||
### 4. `internal/lsp`
|
### 4. `internal/lsp`
|
||||||
@@ -104,3 +106,11 @@ Manages CUE schemas.
|
|||||||
4. For each GAM, resolves connected `DataSources` via Input/Output signals.
|
4. For each GAM, resolves connected `DataSources` via Input/Output signals.
|
||||||
5. Maps `DataSource -> Thread` within the context of a State.
|
5. Maps `DataSource -> Thread` within the context of a State.
|
||||||
6. If a DataSource is seen in >1 Thread, it checks the `#meta.multithreaded` property. If false (default), an error is raised.
|
6. If a DataSource is seen in >1 Thread, it checks the `#meta.multithreaded` property. If false (default), an error is raised.
|
||||||
|
|
||||||
|
### INOUT Ordering Logic
|
||||||
|
1. Iterates Threads.
|
||||||
|
2. Iterates GAMs in execution order.
|
||||||
|
3. Tracks `producedSignals` and `consumedSignals`.
|
||||||
|
4. For each GAM, checks Inputs. If Input is `INOUT` (and not multithreaded) and not in `producedSignals`, reports "Consumed before Produced" error.
|
||||||
|
5. Registers Outputs in `producedSignals`.
|
||||||
|
6. At end of thread, checks for signals that were produced but never consumed, reporting a warning.
|
||||||
|
|||||||
@@ -179,6 +179,17 @@ Reference a variable using `$`:
|
|||||||
Field = $MyVar
|
Field = $MyVar
|
||||||
```
|
```
|
||||||
|
|
||||||
|
### Expressions
|
||||||
|
You can use operators in field values. Supported operators:
|
||||||
|
- **Math**: `+`, `-`, `*`, `/`, `%`, `^` (XOR), `&`, `|` (Bitwise)
|
||||||
|
- **String Concatenation**: `..`
|
||||||
|
|
||||||
|
```marte
|
||||||
|
Field1 = 10 + 20 * 2 // 50
|
||||||
|
Field2 = "Hello " .. "World"
|
||||||
|
Field3 = $MyVar + 5
|
||||||
|
```
|
||||||
|
|
||||||
### Build Override
|
### Build Override
|
||||||
You can override variable values during build:
|
You can override variable values during build:
|
||||||
|
|
||||||
|
|||||||
@@ -158,6 +158,7 @@ func (b *Builder) writeDefinition(f *os.File, def parser.Definition, indent int)
|
|||||||
}
|
}
|
||||||
|
|
||||||
func (b *Builder) formatValue(val parser.Value) string {
|
func (b *Builder) formatValue(val parser.Value) string {
|
||||||
|
val = b.evaluate(val)
|
||||||
switch v := val.(type) {
|
switch v := val.(type) {
|
||||||
case *parser.StringValue:
|
case *parser.StringValue:
|
||||||
if v.Quoted {
|
if v.Quoted {
|
||||||
@@ -171,10 +172,6 @@ func (b *Builder) formatValue(val parser.Value) string {
|
|||||||
case *parser.BoolValue:
|
case *parser.BoolValue:
|
||||||
return fmt.Sprintf("%v", v.Value)
|
return fmt.Sprintf("%v", v.Value)
|
||||||
case *parser.VariableReferenceValue:
|
case *parser.VariableReferenceValue:
|
||||||
name := strings.TrimPrefix(v.Name, "$")
|
|
||||||
if val, ok := b.variables[name]; ok {
|
|
||||||
return b.formatValue(val)
|
|
||||||
}
|
|
||||||
return v.Name
|
return v.Name
|
||||||
case *parser.ReferenceValue:
|
case *parser.ReferenceValue:
|
||||||
return v.Value
|
return v.Value
|
||||||
@@ -234,3 +231,108 @@ func (b *Builder) collectVariables(tree *index.ProjectTree) {
|
|||||||
}
|
}
|
||||||
tree.Walk(processNode)
|
tree.Walk(processNode)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func (b *Builder) evaluate(val parser.Value) parser.Value {
|
||||||
|
switch v := val.(type) {
|
||||||
|
case *parser.VariableReferenceValue:
|
||||||
|
name := strings.TrimPrefix(v.Name, "$")
|
||||||
|
if res, ok := b.variables[name]; ok {
|
||||||
|
return b.evaluate(res)
|
||||||
|
}
|
||||||
|
return v
|
||||||
|
case *parser.BinaryExpression:
|
||||||
|
left := b.evaluate(v.Left)
|
||||||
|
right := b.evaluate(v.Right)
|
||||||
|
return b.compute(left, v.Operator, right)
|
||||||
|
}
|
||||||
|
return val
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *Builder) compute(left parser.Value, op parser.Token, right parser.Value) parser.Value {
|
||||||
|
if op.Type == parser.TokenConcat {
|
||||||
|
s1 := b.valToString(left)
|
||||||
|
s2 := b.valToString(right)
|
||||||
|
return &parser.StringValue{Value: s1 + s2, Quoted: true}
|
||||||
|
}
|
||||||
|
|
||||||
|
lF, lIsF := b.valToFloat(left)
|
||||||
|
rF, rIsF := b.valToFloat(right)
|
||||||
|
|
||||||
|
if lIsF || rIsF {
|
||||||
|
res := 0.0
|
||||||
|
switch op.Type {
|
||||||
|
case parser.TokenPlus:
|
||||||
|
res = lF + rF
|
||||||
|
case parser.TokenMinus:
|
||||||
|
res = lF - rF
|
||||||
|
case parser.TokenStar:
|
||||||
|
res = lF * rF
|
||||||
|
case parser.TokenSlash:
|
||||||
|
res = lF / rF
|
||||||
|
}
|
||||||
|
return &parser.FloatValue{Value: res, Raw: fmt.Sprintf("%g", res)}
|
||||||
|
}
|
||||||
|
|
||||||
|
lI, lIsI := b.valToInt(left)
|
||||||
|
rI, rIsI := b.valToInt(right)
|
||||||
|
|
||||||
|
if lIsI && rIsI {
|
||||||
|
res := int64(0)
|
||||||
|
switch op.Type {
|
||||||
|
case parser.TokenPlus:
|
||||||
|
res = lI + rI
|
||||||
|
case parser.TokenMinus:
|
||||||
|
res = lI - rI
|
||||||
|
case parser.TokenStar:
|
||||||
|
res = lI * rI
|
||||||
|
case parser.TokenSlash:
|
||||||
|
if rI != 0 {
|
||||||
|
res = lI / rI
|
||||||
|
}
|
||||||
|
case parser.TokenPercent:
|
||||||
|
if rI != 0 {
|
||||||
|
res = lI % rI
|
||||||
|
}
|
||||||
|
case parser.TokenAmpersand:
|
||||||
|
res = lI & rI
|
||||||
|
case parser.TokenPipe:
|
||||||
|
res = lI | rI
|
||||||
|
case parser.TokenCaret:
|
||||||
|
res = lI ^ rI
|
||||||
|
}
|
||||||
|
return &parser.IntValue{Value: res, Raw: fmt.Sprintf("%d", res)}
|
||||||
|
}
|
||||||
|
|
||||||
|
return left
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *Builder) valToString(v parser.Value) string {
|
||||||
|
switch val := v.(type) {
|
||||||
|
case *parser.StringValue:
|
||||||
|
return val.Value
|
||||||
|
case *parser.IntValue:
|
||||||
|
return val.Raw
|
||||||
|
case *parser.FloatValue:
|
||||||
|
return val.Raw
|
||||||
|
default:
|
||||||
|
return ""
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *Builder) valToFloat(v parser.Value) (float64, bool) {
|
||||||
|
switch val := v.(type) {
|
||||||
|
case *parser.FloatValue:
|
||||||
|
return val.Value, true
|
||||||
|
case *parser.IntValue:
|
||||||
|
return float64(val.Value), true
|
||||||
|
}
|
||||||
|
return 0, false
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *Builder) valToInt(v parser.Value) (int64, bool) {
|
||||||
|
switch val := v.(type) {
|
||||||
|
case *parser.IntValue:
|
||||||
|
return val.Value, true
|
||||||
|
}
|
||||||
|
return 0, false
|
||||||
|
}
|
||||||
|
|||||||
@@ -19,7 +19,6 @@ type ProjectTree struct {
|
|||||||
IsolatedFiles map[string]*ProjectNode
|
IsolatedFiles map[string]*ProjectNode
|
||||||
GlobalPragmas map[string][]string
|
GlobalPragmas map[string][]string
|
||||||
NodeMap map[string][]*ProjectNode
|
NodeMap map[string][]*ProjectNode
|
||||||
Variables map[string]VariableInfo
|
|
||||||
}
|
}
|
||||||
|
|
||||||
func (pt *ProjectTree) ScanDirectory(rootPath string) error {
|
func (pt *ProjectTree) ScanDirectory(rootPath string) error {
|
||||||
@@ -48,6 +47,7 @@ type Reference struct {
|
|||||||
File string
|
File string
|
||||||
Target *ProjectNode
|
Target *ProjectNode
|
||||||
TargetVariable *parser.VariableDefinition
|
TargetVariable *parser.VariableDefinition
|
||||||
|
IsVariable bool
|
||||||
}
|
}
|
||||||
|
|
||||||
type ProjectNode struct {
|
type ProjectNode struct {
|
||||||
@@ -60,6 +60,7 @@ type ProjectNode struct {
|
|||||||
Metadata map[string]string // Store extra info like Class, Type, Size
|
Metadata map[string]string // Store extra info like Class, Type, Size
|
||||||
Target *ProjectNode // Points to referenced node (for Direct References/Links)
|
Target *ProjectNode // Points to referenced node (for Direct References/Links)
|
||||||
Pragmas []string
|
Pragmas []string
|
||||||
|
Variables map[string]VariableInfo
|
||||||
}
|
}
|
||||||
|
|
||||||
type Fragment struct {
|
type Fragment struct {
|
||||||
@@ -76,10 +77,10 @@ func NewProjectTree() *ProjectTree {
|
|||||||
Root: &ProjectNode{
|
Root: &ProjectNode{
|
||||||
Children: make(map[string]*ProjectNode),
|
Children: make(map[string]*ProjectNode),
|
||||||
Metadata: make(map[string]string),
|
Metadata: make(map[string]string),
|
||||||
|
Variables: make(map[string]VariableInfo),
|
||||||
},
|
},
|
||||||
IsolatedFiles: make(map[string]*ProjectNode),
|
IsolatedFiles: make(map[string]*ProjectNode),
|
||||||
GlobalPragmas: make(map[string][]string),
|
GlobalPragmas: make(map[string][]string),
|
||||||
Variables: make(map[string]VariableInfo),
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -184,6 +185,7 @@ func (pt *ProjectTree) AddFile(file string, config *parser.Configuration) {
|
|||||||
node := &ProjectNode{
|
node := &ProjectNode{
|
||||||
Children: make(map[string]*ProjectNode),
|
Children: make(map[string]*ProjectNode),
|
||||||
Metadata: make(map[string]string),
|
Metadata: make(map[string]string),
|
||||||
|
Variables: make(map[string]VariableInfo),
|
||||||
}
|
}
|
||||||
pt.IsolatedFiles[file] = node
|
pt.IsolatedFiles[file] = node
|
||||||
pt.populateNode(node, file, config)
|
pt.populateNode(node, file, config)
|
||||||
@@ -205,6 +207,7 @@ func (pt *ProjectTree) AddFile(file string, config *parser.Configuration) {
|
|||||||
Children: make(map[string]*ProjectNode),
|
Children: make(map[string]*ProjectNode),
|
||||||
Parent: node,
|
Parent: node,
|
||||||
Metadata: make(map[string]string),
|
Metadata: make(map[string]string),
|
||||||
|
Variables: make(map[string]VariableInfo),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
node = node.Children[part]
|
node = node.Children[part]
|
||||||
@@ -229,7 +232,7 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
|
|||||||
pt.indexValue(file, d.Value)
|
pt.indexValue(file, d.Value)
|
||||||
case *parser.VariableDefinition:
|
case *parser.VariableDefinition:
|
||||||
fileFragment.Definitions = append(fileFragment.Definitions, d)
|
fileFragment.Definitions = append(fileFragment.Definitions, d)
|
||||||
pt.Variables[d.Name] = VariableInfo{Def: d, File: file}
|
node.Variables[d.Name] = VariableInfo{Def: d, File: file}
|
||||||
case *parser.ObjectNode:
|
case *parser.ObjectNode:
|
||||||
fileFragment.Definitions = append(fileFragment.Definitions, d)
|
fileFragment.Definitions = append(fileFragment.Definitions, d)
|
||||||
norm := NormalizeName(d.Name)
|
norm := NormalizeName(d.Name)
|
||||||
@@ -240,6 +243,7 @@ func (pt *ProjectTree) populateNode(node *ProjectNode, file string, config *pars
|
|||||||
Children: make(map[string]*ProjectNode),
|
Children: make(map[string]*ProjectNode),
|
||||||
Parent: node,
|
Parent: node,
|
||||||
Metadata: make(map[string]string),
|
Metadata: make(map[string]string),
|
||||||
|
Variables: make(map[string]VariableInfo),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
child := node.Children[norm]
|
child := node.Children[norm]
|
||||||
@@ -287,7 +291,7 @@ func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *pa
|
|||||||
pt.extractFieldMetadata(node, d)
|
pt.extractFieldMetadata(node, d)
|
||||||
case *parser.VariableDefinition:
|
case *parser.VariableDefinition:
|
||||||
frag.Definitions = append(frag.Definitions, d)
|
frag.Definitions = append(frag.Definitions, d)
|
||||||
pt.Variables[d.Name] = VariableInfo{Def: d, File: file}
|
node.Variables[d.Name] = VariableInfo{Def: d, File: file}
|
||||||
case *parser.ObjectNode:
|
case *parser.ObjectNode:
|
||||||
frag.Definitions = append(frag.Definitions, d)
|
frag.Definitions = append(frag.Definitions, d)
|
||||||
norm := NormalizeName(d.Name)
|
norm := NormalizeName(d.Name)
|
||||||
@@ -298,6 +302,7 @@ func (pt *ProjectTree) addObjectFragment(node *ProjectNode, file string, obj *pa
|
|||||||
Children: make(map[string]*ProjectNode),
|
Children: make(map[string]*ProjectNode),
|
||||||
Parent: node,
|
Parent: node,
|
||||||
Metadata: make(map[string]string),
|
Metadata: make(map[string]string),
|
||||||
|
Variables: make(map[string]VariableInfo),
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
child := node.Children[norm]
|
child := node.Children[norm]
|
||||||
@@ -398,6 +403,7 @@ func (pt *ProjectTree) indexValue(file string, val parser.Value) {
|
|||||||
Name: strings.TrimPrefix(v.Name, "$"),
|
Name: strings.TrimPrefix(v.Name, "$"),
|
||||||
Position: v.Position,
|
Position: v.Position,
|
||||||
File: file,
|
File: file,
|
||||||
|
IsVariable: true,
|
||||||
})
|
})
|
||||||
case *parser.ArrayValue:
|
case *parser.ArrayValue:
|
||||||
for _, elem := range v.Elements {
|
for _, elem := range v.Elements {
|
||||||
@@ -422,12 +428,13 @@ func (pt *ProjectTree) ResolveReferences() {
|
|||||||
for i := range pt.References {
|
for i := range pt.References {
|
||||||
ref := &pt.References[i]
|
ref := &pt.References[i]
|
||||||
|
|
||||||
if v, ok := pt.Variables[ref.Name]; ok {
|
container := pt.GetNodeContaining(ref.File, ref.Position)
|
||||||
|
|
||||||
|
if v := pt.ResolveVariable(container, ref.Name); v != nil {
|
||||||
ref.TargetVariable = v.Def
|
ref.TargetVariable = v.Def
|
||||||
continue
|
continue
|
||||||
}
|
}
|
||||||
|
|
||||||
container := pt.GetNodeContaining(ref.File, ref.Position)
|
|
||||||
ref.Target = pt.resolveScopedName(container, ref.Name)
|
ref.Target = pt.resolveScopedName(container, ref.Name)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -637,7 +644,12 @@ func (pt *ProjectTree) resolveScopedName(ctx *ProjectNode, name string) *Project
|
|||||||
}
|
}
|
||||||
|
|
||||||
if startNode == nil {
|
if startNode == nil {
|
||||||
return nil
|
// Fallback to deep search from context root
|
||||||
|
root := ctx
|
||||||
|
for root.Parent != nil {
|
||||||
|
root = root.Parent
|
||||||
|
}
|
||||||
|
return pt.FindNode(root, name, nil)
|
||||||
}
|
}
|
||||||
|
|
||||||
curr = startNode
|
curr = startNode
|
||||||
@@ -651,3 +663,19 @@ func (pt *ProjectTree) resolveScopedName(ctx *ProjectNode, name string) *Project
|
|||||||
}
|
}
|
||||||
return curr
|
return curr
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func (pt *ProjectTree) ResolveVariable(ctx *ProjectNode, name string) *VariableInfo {
|
||||||
|
curr := ctx
|
||||||
|
for curr != nil {
|
||||||
|
if v, ok := curr.Variables[name]; ok {
|
||||||
|
return &v
|
||||||
|
}
|
||||||
|
curr = curr.Parent
|
||||||
|
}
|
||||||
|
if ctx == nil {
|
||||||
|
if v, ok := pt.Root.Variables[name]; ok {
|
||||||
|
return &v
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|||||||
@@ -248,8 +248,11 @@ func HandleMessage(msg *JsonRpcMessage) {
|
|||||||
if err := Tree.ScanDirectory(root); err != nil {
|
if err := Tree.ScanDirectory(root); err != nil {
|
||||||
logger.Printf("ScanDirectory failed: %v\n", err)
|
logger.Printf("ScanDirectory failed: %v\n", err)
|
||||||
}
|
}
|
||||||
|
logger.Printf("Scan done")
|
||||||
Tree.ResolveReferences()
|
Tree.ResolveReferences()
|
||||||
|
logger.Printf("Resolve done")
|
||||||
GlobalSchema = schema.LoadFullSchema(ProjectRoot)
|
GlobalSchema = schema.LoadFullSchema(ProjectRoot)
|
||||||
|
logger.Printf("Schema done")
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -1064,7 +1067,8 @@ func HandleDefinition(params DefinitionParams) any {
|
|||||||
}
|
}
|
||||||
|
|
||||||
if targetVar != nil {
|
if targetVar != nil {
|
||||||
if info, ok := Tree.Variables[targetVar.Name]; ok {
|
container := Tree.GetNodeContaining(path, parser.Position{Line: line, Column: col})
|
||||||
|
if info := Tree.ResolveVariable(container, targetVar.Name); info != nil {
|
||||||
return []Location{{
|
return []Location{{
|
||||||
URI: "file://" + info.File,
|
URI: "file://" + info.File,
|
||||||
Range: Range{
|
Range: Range{
|
||||||
@@ -1123,7 +1127,8 @@ func HandleReferences(params ReferenceParams) []Location {
|
|||||||
var locations []Location
|
var locations []Location
|
||||||
// Declaration
|
// Declaration
|
||||||
if params.Context.IncludeDeclaration {
|
if params.Context.IncludeDeclaration {
|
||||||
if info, ok := Tree.Variables[targetVar.Name]; ok {
|
container := Tree.GetNodeContaining(path, parser.Position{Line: line, Column: col})
|
||||||
|
if info := Tree.ResolveVariable(container, targetVar.Name); info != nil {
|
||||||
locations = append(locations, Location{
|
locations = append(locations, Location{
|
||||||
URI: "file://" + info.File,
|
URI: "file://" + info.File,
|
||||||
Range: Range{
|
Range: Range{
|
||||||
|
|||||||
@@ -143,3 +143,13 @@ type VariableReferenceValue struct {
|
|||||||
|
|
||||||
func (v *VariableReferenceValue) Pos() Position { return v.Position }
|
func (v *VariableReferenceValue) Pos() Position { return v.Position }
|
||||||
func (v *VariableReferenceValue) isValue() {}
|
func (v *VariableReferenceValue) isValue() {}
|
||||||
|
|
||||||
|
type BinaryExpression struct {
|
||||||
|
Position Position
|
||||||
|
Left Value
|
||||||
|
Operator Token
|
||||||
|
Right Value
|
||||||
|
}
|
||||||
|
|
||||||
|
func (b *BinaryExpression) Pos() Position { return b.Position }
|
||||||
|
func (b *BinaryExpression) isValue() {}
|
||||||
|
|||||||
@@ -28,6 +28,14 @@ const (
|
|||||||
TokenLBracket
|
TokenLBracket
|
||||||
TokenRBracket
|
TokenRBracket
|
||||||
TokenSymbol
|
TokenSymbol
|
||||||
|
TokenPlus
|
||||||
|
TokenMinus
|
||||||
|
TokenStar
|
||||||
|
TokenSlash
|
||||||
|
TokenPercent
|
||||||
|
TokenCaret
|
||||||
|
TokenAmpersand
|
||||||
|
TokenConcat
|
||||||
)
|
)
|
||||||
|
|
||||||
type Token struct {
|
type Token struct {
|
||||||
@@ -137,16 +145,45 @@ func (l *Lexer) NextToken() Token {
|
|||||||
return l.emit(TokenLBracket)
|
return l.emit(TokenLBracket)
|
||||||
case ']':
|
case ']':
|
||||||
return l.emit(TokenRBracket)
|
return l.emit(TokenRBracket)
|
||||||
case '&', '?', '!', '<', '>', '*', '(', ')', '~', '%', '^':
|
case '+':
|
||||||
|
if unicode.IsSpace(l.peek()) {
|
||||||
|
return l.emit(TokenPlus)
|
||||||
|
}
|
||||||
|
return l.lexObjectIdentifier()
|
||||||
|
case '-':
|
||||||
|
if unicode.IsDigit(l.peek()) {
|
||||||
|
return l.lexNumber()
|
||||||
|
}
|
||||||
|
if unicode.IsSpace(l.peek()) {
|
||||||
|
return l.emit(TokenMinus)
|
||||||
|
}
|
||||||
|
return l.lexIdentifier()
|
||||||
|
case '*':
|
||||||
|
return l.emit(TokenStar)
|
||||||
|
case '/':
|
||||||
|
p := l.peek()
|
||||||
|
if p == '/' || p == '*' || p == '#' || p == '!' {
|
||||||
|
return l.lexComment()
|
||||||
|
}
|
||||||
|
return l.emit(TokenSlash)
|
||||||
|
case '%':
|
||||||
|
return l.emit(TokenPercent)
|
||||||
|
case '^':
|
||||||
|
return l.emit(TokenCaret)
|
||||||
|
case '&':
|
||||||
|
return l.emit(TokenAmpersand)
|
||||||
|
case '.':
|
||||||
|
if l.peek() == '.' {
|
||||||
|
l.next()
|
||||||
|
return l.emit(TokenConcat)
|
||||||
|
}
|
||||||
|
return l.emit(TokenSymbol)
|
||||||
|
case '~', '!', '<', '>', '(', ')', '?', '\\':
|
||||||
return l.emit(TokenSymbol)
|
return l.emit(TokenSymbol)
|
||||||
case '"':
|
case '"':
|
||||||
return l.lexString()
|
return l.lexString()
|
||||||
case '/':
|
|
||||||
return l.lexComment()
|
|
||||||
case '#':
|
case '#':
|
||||||
return l.lexHashIdentifier()
|
return l.lexHashIdentifier()
|
||||||
case '+':
|
|
||||||
fallthrough
|
|
||||||
case '$':
|
case '$':
|
||||||
return l.lexObjectIdentifier()
|
return l.lexObjectIdentifier()
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -226,6 +226,56 @@ func (p *Parser) parseSubnode() (Subnode, bool) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
func (p *Parser) parseValue() (Value, bool) {
|
func (p *Parser) parseValue() (Value, bool) {
|
||||||
|
return p.parseExpression(0)
|
||||||
|
}
|
||||||
|
|
||||||
|
func getPrecedence(t TokenType) int {
|
||||||
|
switch t {
|
||||||
|
case TokenStar, TokenSlash, TokenPercent:
|
||||||
|
return 5
|
||||||
|
case TokenPlus, TokenMinus:
|
||||||
|
return 4
|
||||||
|
case TokenConcat:
|
||||||
|
return 3
|
||||||
|
case TokenAmpersand:
|
||||||
|
return 2
|
||||||
|
case TokenPipe, TokenCaret:
|
||||||
|
return 1
|
||||||
|
default:
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *Parser) parseExpression(minPrecedence int) (Value, bool) {
|
||||||
|
left, ok := p.parseAtom()
|
||||||
|
if !ok {
|
||||||
|
return nil, false
|
||||||
|
}
|
||||||
|
|
||||||
|
for {
|
||||||
|
t := p.peek()
|
||||||
|
prec := getPrecedence(t.Type)
|
||||||
|
if prec == 0 || prec <= minPrecedence {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
p.next()
|
||||||
|
|
||||||
|
right, ok := p.parseExpression(prec)
|
||||||
|
if !ok {
|
||||||
|
return nil, false
|
||||||
|
}
|
||||||
|
|
||||||
|
left = &BinaryExpression{
|
||||||
|
Position: left.Pos(),
|
||||||
|
Left: left,
|
||||||
|
Operator: t,
|
||||||
|
Right: right,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return left, true
|
||||||
|
}
|
||||||
|
|
||||||
|
func (p *Parser) parseAtom() (Value, bool) {
|
||||||
tok := p.next()
|
tok := p.next()
|
||||||
switch tok.Type {
|
switch tok.Type {
|
||||||
case TokenString:
|
case TokenString:
|
||||||
|
|||||||
@@ -57,6 +57,7 @@ func (v *Validator) ValidateProject() {
|
|||||||
v.CheckDataSourceThreading()
|
v.CheckDataSourceThreading()
|
||||||
v.CheckINOUTOrdering()
|
v.CheckINOUTOrdering()
|
||||||
v.CheckVariables()
|
v.CheckVariables()
|
||||||
|
v.CheckUnresolvedVariables()
|
||||||
}
|
}
|
||||||
|
|
||||||
func (v *Validator) validateNode(node *index.ProjectNode) {
|
func (v *Validator) validateNode(node *index.ProjectNode) {
|
||||||
@@ -95,7 +96,7 @@ func (v *Validator) validateNode(node *index.ProjectNode) {
|
|||||||
className := ""
|
className := ""
|
||||||
if node.RealName != "" && (node.RealName[0] == '+' || node.RealName[0] == '$') {
|
if node.RealName != "" && (node.RealName[0] == '+' || node.RealName[0] == '$') {
|
||||||
if classFields, ok := fields["Class"]; ok && len(classFields) > 0 {
|
if classFields, ok := fields["Class"]; ok && len(classFields) > 0 {
|
||||||
className = v.getFieldValue(classFields[0])
|
className = v.getFieldValue(classFields[0], node)
|
||||||
}
|
}
|
||||||
|
|
||||||
hasType := false
|
hasType := false
|
||||||
@@ -188,7 +189,7 @@ func (v *Validator) nodeToMap(node *index.ProjectNode) map[string]interface{} {
|
|||||||
for name, defs := range fields {
|
for name, defs := range fields {
|
||||||
if len(defs) > 0 {
|
if len(defs) > 0 {
|
||||||
// Use the last definition (duplicates checked elsewhere)
|
// Use the last definition (duplicates checked elsewhere)
|
||||||
m[name] = v.valueToInterface(defs[len(defs)-1].Value)
|
m[name] = v.valueToInterface(defs[len(defs)-1].Value, node)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -207,13 +208,13 @@ func (v *Validator) nodeToMap(node *index.ProjectNode) map[string]interface{} {
|
|||||||
return m
|
return m
|
||||||
}
|
}
|
||||||
|
|
||||||
func (v *Validator) valueToInterface(val parser.Value) interface{} {
|
func (v *Validator) valueToInterface(val parser.Value, ctx *index.ProjectNode) interface{} {
|
||||||
switch t := val.(type) {
|
switch t := val.(type) {
|
||||||
case *parser.StringValue:
|
case *parser.StringValue:
|
||||||
return t.Value
|
return t.Value
|
||||||
case *parser.IntValue:
|
case *parser.IntValue:
|
||||||
i, _ := strconv.ParseInt(t.Raw, 0, 64)
|
i, _ := strconv.ParseInt(t.Raw, 0, 64)
|
||||||
return i // CUE handles int64
|
return i
|
||||||
case *parser.FloatValue:
|
case *parser.FloatValue:
|
||||||
f, _ := strconv.ParseFloat(t.Raw, 64)
|
f, _ := strconv.ParseFloat(t.Raw, 64)
|
||||||
return f
|
return f
|
||||||
@@ -223,16 +224,16 @@ func (v *Validator) valueToInterface(val parser.Value) interface{} {
|
|||||||
return t.Value
|
return t.Value
|
||||||
case *parser.VariableReferenceValue:
|
case *parser.VariableReferenceValue:
|
||||||
name := strings.TrimPrefix(t.Name, "$")
|
name := strings.TrimPrefix(t.Name, "$")
|
||||||
if info, ok := v.Tree.Variables[name]; ok {
|
if info := v.Tree.ResolveVariable(ctx, name); info != nil {
|
||||||
if info.Def.DefaultValue != nil {
|
if info.Def.DefaultValue != nil {
|
||||||
return v.valueToInterface(info.Def.DefaultValue)
|
return v.valueToInterface(info.Def.DefaultValue, ctx)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
return nil
|
return nil
|
||||||
case *parser.ArrayValue:
|
case *parser.ArrayValue:
|
||||||
var arr []interface{}
|
var arr []interface{}
|
||||||
for _, e := range t.Elements {
|
for _, e := range t.Elements {
|
||||||
arr = append(arr, v.valueToInterface(e))
|
arr = append(arr, v.valueToInterface(e, ctx))
|
||||||
}
|
}
|
||||||
return arr
|
return arr
|
||||||
}
|
}
|
||||||
@@ -296,7 +297,7 @@ func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, di
|
|||||||
fields := v.getFields(signalNode)
|
fields := v.getFields(signalNode)
|
||||||
var dsName string
|
var dsName string
|
||||||
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
|
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
|
||||||
dsName = v.getFieldValue(dsFields[0])
|
dsName = v.getFieldValue(dsFields[0], signalNode)
|
||||||
}
|
}
|
||||||
|
|
||||||
if dsName == "" {
|
if dsName == "" {
|
||||||
@@ -355,7 +356,7 @@ func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, di
|
|||||||
// Check Signal Existence
|
// Check Signal Existence
|
||||||
targetSignalName := index.NormalizeName(signalNode.RealName)
|
targetSignalName := index.NormalizeName(signalNode.RealName)
|
||||||
if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
|
if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
|
||||||
targetSignalName = v.getFieldValue(aliasFields[0]) // Alias is usually the name in DataSource
|
targetSignalName = v.getFieldValue(aliasFields[0], signalNode) // Alias is usually the name in DataSource
|
||||||
}
|
}
|
||||||
|
|
||||||
var targetNode *index.ProjectNode
|
var targetNode *index.ProjectNode
|
||||||
@@ -404,7 +405,7 @@ func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, di
|
|||||||
})
|
})
|
||||||
} else {
|
} else {
|
||||||
// Check Type validity even for implicit
|
// Check Type validity even for implicit
|
||||||
typeVal := v.getFieldValue(typeFields[0])
|
typeVal := v.getFieldValue(typeFields[0], signalNode)
|
||||||
if !isValidType(typeVal) {
|
if !isValidType(typeVal) {
|
||||||
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
||||||
Level: LevelError,
|
Level: LevelError,
|
||||||
@@ -430,7 +431,7 @@ func (v *Validator) validateGAMSignal(gamNode, signalNode *index.ProjectNode, di
|
|||||||
|
|
||||||
// Check Type validity if present
|
// Check Type validity if present
|
||||||
if typeFields, ok := fields["Type"]; ok && len(typeFields) > 0 {
|
if typeFields, ok := fields["Type"]; ok && len(typeFields) > 0 {
|
||||||
typeVal := v.getFieldValue(typeFields[0])
|
typeVal := v.getFieldValue(typeFields[0], signalNode)
|
||||||
if !isValidType(typeVal) {
|
if !isValidType(typeVal) {
|
||||||
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
||||||
Level: LevelError,
|
Level: LevelError,
|
||||||
@@ -511,7 +512,7 @@ func (v *Validator) getFields(node *index.ProjectNode) map[string][]*parser.Fiel
|
|||||||
return fields
|
return fields
|
||||||
}
|
}
|
||||||
|
|
||||||
func (v *Validator) getFieldValue(f *parser.Field) string {
|
func (v *Validator) getFieldValue(f *parser.Field, ctx *index.ProjectNode) string {
|
||||||
switch val := f.Value.(type) {
|
switch val := f.Value.(type) {
|
||||||
case *parser.StringValue:
|
case *parser.StringValue:
|
||||||
return val.Value
|
return val.Value
|
||||||
@@ -523,6 +524,13 @@ func (v *Validator) getFieldValue(f *parser.Field) string {
|
|||||||
return val.Raw
|
return val.Raw
|
||||||
case *parser.BoolValue:
|
case *parser.BoolValue:
|
||||||
return strconv.FormatBool(val.Value)
|
return strconv.FormatBool(val.Value)
|
||||||
|
case *parser.VariableReferenceValue:
|
||||||
|
name := strings.TrimPrefix(val.Name, "$")
|
||||||
|
if info := v.Tree.ResolveVariable(ctx, name); info != nil {
|
||||||
|
if info.Def.DefaultValue != nil {
|
||||||
|
return v.getFieldValue(&parser.Field{Value: info.Def.DefaultValue}, ctx)
|
||||||
|
}
|
||||||
|
}
|
||||||
}
|
}
|
||||||
return ""
|
return ""
|
||||||
}
|
}
|
||||||
@@ -865,7 +873,7 @@ func (v *Validator) getGAMDataSources(gam *index.ProjectNode) []*index.ProjectNo
|
|||||||
for _, sig := range container.Children {
|
for _, sig := range container.Children {
|
||||||
fields := v.getFields(sig)
|
fields := v.getFields(sig)
|
||||||
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
|
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
|
||||||
dsName := v.getFieldValue(dsFields[0])
|
dsName := v.getFieldValue(dsFields[0], sig)
|
||||||
dsNode := v.resolveReference(dsName, v.getNodeFile(sig), isDataSource)
|
dsNode := v.resolveReference(dsName, v.getNodeFile(sig), isDataSource)
|
||||||
if dsNode != nil {
|
if dsNode != nil {
|
||||||
dsMap[dsNode] = true
|
dsMap[dsNode] = true
|
||||||
@@ -888,7 +896,7 @@ func (v *Validator) isMultithreaded(ds *index.ProjectNode) bool {
|
|||||||
if meta, ok := ds.Children["#meta"]; ok {
|
if meta, ok := ds.Children["#meta"]; ok {
|
||||||
fields := v.getFields(meta)
|
fields := v.getFields(meta)
|
||||||
if mt, ok := fields["multithreaded"]; ok && len(mt) > 0 {
|
if mt, ok := fields["multithreaded"]; ok && len(mt) > 0 {
|
||||||
val := v.getFieldValue(mt[0])
|
val := v.getFieldValue(mt[0], meta)
|
||||||
return val == "true"
|
return val == "true"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -999,11 +1007,11 @@ func (v *Validator) processGAMSignalsForOrdering(gam *index.ProjectNode, contain
|
|||||||
|
|
||||||
if dsNode == nil {
|
if dsNode == nil {
|
||||||
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
|
if dsFields, ok := fields["DataSource"]; ok && len(dsFields) > 0 {
|
||||||
dsName := v.getFieldValue(dsFields[0])
|
dsName := v.getFieldValue(dsFields[0], sig)
|
||||||
dsNode = v.resolveReference(dsName, v.getNodeFile(sig), isDataSource)
|
dsNode = v.resolveReference(dsName, v.getNodeFile(sig), isDataSource)
|
||||||
}
|
}
|
||||||
if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
|
if aliasFields, ok := fields["Alias"]; ok && len(aliasFields) > 0 {
|
||||||
sigName = v.getFieldValue(aliasFields[0])
|
sigName = v.getFieldValue(aliasFields[0], sig)
|
||||||
} else {
|
} else {
|
||||||
sigName = sig.RealName
|
sigName = sig.RealName
|
||||||
}
|
}
|
||||||
@@ -1077,7 +1085,8 @@ func (v *Validator) CheckVariables() {
|
|||||||
}
|
}
|
||||||
ctx := v.Schema.Context
|
ctx := v.Schema.Context
|
||||||
|
|
||||||
for _, info := range v.Tree.Variables {
|
checkNodeVars := func(node *index.ProjectNode) {
|
||||||
|
for _, info := range node.Variables {
|
||||||
def := info.Def
|
def := info.Def
|
||||||
|
|
||||||
// Compile Type
|
// Compile Type
|
||||||
@@ -1093,7 +1102,7 @@ func (v *Validator) CheckVariables() {
|
|||||||
}
|
}
|
||||||
|
|
||||||
if def.DefaultValue != nil {
|
if def.DefaultValue != nil {
|
||||||
valInterface := v.valueToInterface(def.DefaultValue)
|
valInterface := v.valueToInterface(def.DefaultValue, node)
|
||||||
valVal := ctx.Encode(valInterface)
|
valVal := ctx.Encode(valInterface)
|
||||||
|
|
||||||
// Unify
|
// Unify
|
||||||
@@ -1108,4 +1117,19 @@ func (v *Validator) CheckVariables() {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
v.Tree.Walk(checkNodeVars)
|
||||||
}
|
}
|
||||||
|
func (v *Validator) CheckUnresolvedVariables() {
|
||||||
|
for _, ref := range v.Tree.References {
|
||||||
|
if ref.IsVariable && ref.TargetVariable == nil {
|
||||||
|
v.Diagnostics = append(v.Diagnostics, Diagnostic{
|
||||||
|
Level: LevelError,
|
||||||
|
Message: fmt.Sprintf("Unresolved variable reference: '$%s'", ref.Name),
|
||||||
|
Position: ref.Position,
|
||||||
|
File: ref.File,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|||||||
90
test/lsp_app_test_repro_test.go
Normal file
90
test/lsp_app_test_repro_test.go
Normal file
@@ -0,0 +1,90 @@
|
|||||||
|
package integration
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/index"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/lsp"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/schema"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestLSPAppTestRepro(t *testing.T) {
|
||||||
|
lsp.Tree = index.NewProjectTree()
|
||||||
|
lsp.Documents = make(map[string]string)
|
||||||
|
lsp.GlobalSchema = schema.LoadFullSchema(".")
|
||||||
|
|
||||||
|
var buf bytes.Buffer
|
||||||
|
lsp.Output = &buf
|
||||||
|
|
||||||
|
content := `+App = {
|
||||||
|
Class = RealTimeApplication
|
||||||
|
+Data = {
|
||||||
|
Class = ReferenceContainer
|
||||||
|
DefaultDataSource = DDB
|
||||||
|
+DDB = {
|
||||||
|
Class = GAMDataSource
|
||||||
|
}
|
||||||
|
+TimingDataSource = {
|
||||||
|
Class = TimingDataSource
|
||||||
|
}
|
||||||
|
}
|
||||||
|
+Functions = {
|
||||||
|
Class = ReferenceContainer
|
||||||
|
+FnA = {
|
||||||
|
Class = IOGAM
|
||||||
|
InputSignals = {
|
||||||
|
A = {
|
||||||
|
DataSource = DDB
|
||||||
|
Type = uint32
|
||||||
|
Value = $Value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
OutputSignals = {
|
||||||
|
B = {
|
||||||
|
DataSource = DDB
|
||||||
|
Type = uint32
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
+States = {
|
||||||
|
Class = ReferenceContainer
|
||||||
|
+State = {
|
||||||
|
Class = RealTimeState
|
||||||
|
Threads = {
|
||||||
|
+Th1 = {
|
||||||
|
Class = RealTimeThread
|
||||||
|
Functions = { FnA }
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
+Scheduler = {
|
||||||
|
Class = GAMScheduler
|
||||||
|
TimingDataSource = TimingDataSource
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
uri := "file://examples/app_test.marte"
|
||||||
|
lsp.HandleDidOpen(lsp.DidOpenTextDocumentParams{
|
||||||
|
TextDocument: lsp.TextDocumentItem{URI: uri, Text: content},
|
||||||
|
})
|
||||||
|
|
||||||
|
output := buf.String()
|
||||||
|
|
||||||
|
// Check Unresolved Variable
|
||||||
|
if !strings.Contains(output, "Unresolved variable reference: '$Value'") {
|
||||||
|
t.Error("LSP missing unresolved variable error")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check INOUT consumed but not produced
|
||||||
|
if !strings.Contains(output, "consumed by GAM '+FnA'") {
|
||||||
|
t.Error("LSP missing consumed but not produced error")
|
||||||
|
}
|
||||||
|
|
||||||
|
if t.Failed() {
|
||||||
|
t.Log(output)
|
||||||
|
}
|
||||||
|
}
|
||||||
167
test/lsp_binary_test.go
Normal file
167
test/lsp_binary_test.go
Normal file
@@ -0,0 +1,167 @@
|
|||||||
|
package integration
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bufio"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"os"
|
||||||
|
"os/exec"
|
||||||
|
"path/filepath"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
"time"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestLSPBinaryDiagnostics(t *testing.T) {
|
||||||
|
// 1. Build mdt
|
||||||
|
// Ensure we are in test directory context
|
||||||
|
buildCmd := exec.Command("go", "build", "-o", "../build/mdt", "../cmd/mdt")
|
||||||
|
if output, err := buildCmd.CombinedOutput(); err != nil {
|
||||||
|
t.Fatalf("Failed to build mdt: %v\nOutput: %s", err, output)
|
||||||
|
}
|
||||||
|
|
||||||
|
// 2. Start mdt lsp
|
||||||
|
cmd := exec.Command("../build/mdt", "lsp")
|
||||||
|
stdin, _ := cmd.StdinPipe()
|
||||||
|
stdout, _ := cmd.StdoutPipe()
|
||||||
|
stderr, _ := cmd.StderrPipe()
|
||||||
|
|
||||||
|
// Pipe stderr to test log for debugging
|
||||||
|
go func() {
|
||||||
|
scanner := bufio.NewScanner(stderr)
|
||||||
|
for scanner.Scan() {
|
||||||
|
t.Logf("LSP STDERR: %s", scanner.Text())
|
||||||
|
}
|
||||||
|
}()
|
||||||
|
|
||||||
|
if err := cmd.Start(); err != nil {
|
||||||
|
t.Fatalf("Failed to start mdt lsp: %v", err)
|
||||||
|
}
|
||||||
|
defer func() {
|
||||||
|
cmd.Process.Kill()
|
||||||
|
cmd.Wait()
|
||||||
|
}()
|
||||||
|
|
||||||
|
reader := bufio.NewReader(stdout)
|
||||||
|
|
||||||
|
send := func(m interface{}) {
|
||||||
|
body, _ := json.Marshal(m)
|
||||||
|
msg := fmt.Sprintf("Content-Length: %d\r\n\r\n%s", len(body), body)
|
||||||
|
stdin.Write([]byte(msg))
|
||||||
|
}
|
||||||
|
|
||||||
|
readCh := make(chan map[string]interface{}, 100)
|
||||||
|
|
||||||
|
go func() { for {
|
||||||
|
// Parse Header
|
||||||
|
line, err := reader.ReadString('\n')
|
||||||
|
if err != nil {
|
||||||
|
close(readCh)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
var length int
|
||||||
|
// Handle Content-Length: <len>\r\n
|
||||||
|
if _, err := fmt.Sscanf(strings.TrimSpace(line), "Content-Length: %d", &length); err != nil {
|
||||||
|
// Maybe empty line or other header?
|
||||||
|
continue
|
||||||
|
}
|
||||||
|
|
||||||
|
// Read until empty line (\r\n)
|
||||||
|
for {
|
||||||
|
l, err := reader.ReadString('\n')
|
||||||
|
if err != nil {
|
||||||
|
close(readCh)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
if l == "\r\n" {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
body := make([]byte, length)
|
||||||
|
if _, err := io.ReadFull(reader, body); err != nil {
|
||||||
|
close(readCh)
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
var m map[string]interface{}
|
||||||
|
if err := json.Unmarshal(body, &m); err == nil {
|
||||||
|
readCh <- m
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}()
|
||||||
|
|
||||||
|
cwd, _ := os.Getwd()
|
||||||
|
projectRoot := filepath.Dir(cwd)
|
||||||
|
absPath := filepath.Join(projectRoot, "examples/app_test.marte")
|
||||||
|
uri := "file://" + absPath
|
||||||
|
|
||||||
|
// 3. Initialize
|
||||||
|
examplesDir := filepath.Join(projectRoot, "examples")
|
||||||
|
send(map[string]interface{}{
|
||||||
|
"jsonrpc": "2.0",
|
||||||
|
"id": 1,
|
||||||
|
"method": "initialize",
|
||||||
|
"params": map[string]interface{}{
|
||||||
|
"rootUri": "file://" + examplesDir,
|
||||||
|
},
|
||||||
|
})
|
||||||
|
|
||||||
|
// 4. Open app_test.marte
|
||||||
|
content, err := os.ReadFile(absPath)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Failed to read test file: %v", err)
|
||||||
|
}
|
||||||
|
send(map[string]interface{}{
|
||||||
|
"jsonrpc": "2.0",
|
||||||
|
"method": "textDocument/didOpen",
|
||||||
|
"params": map[string]interface{}{
|
||||||
|
"textDocument": map[string]interface{}{
|
||||||
|
"uri": uri,
|
||||||
|
"languageId": "marte",
|
||||||
|
"version": 1,
|
||||||
|
"text": string(content),
|
||||||
|
},
|
||||||
|
},
|
||||||
|
})
|
||||||
|
|
||||||
|
// 5. Wait for diagnostics
|
||||||
|
foundOrdering := false
|
||||||
|
foundVariable := false
|
||||||
|
|
||||||
|
timeout := time.After(30 * time.Second)
|
||||||
|
|
||||||
|
for {
|
||||||
|
select {
|
||||||
|
case msg, ok := <-readCh:
|
||||||
|
if !ok {
|
||||||
|
t.Fatal("LSP stream closed unexpectedly")
|
||||||
|
}
|
||||||
|
t.Logf("Received: %v", msg)
|
||||||
|
if method, ok := msg["method"].(string); ok && method == "textDocument/publishDiagnostics" {
|
||||||
|
params := msg["params"].(map[string]interface{})
|
||||||
|
// Check URI match?
|
||||||
|
// if params["uri"] != uri { continue } // Might be absolute vs relative
|
||||||
|
|
||||||
|
diags := params["diagnostics"].([]interface{})
|
||||||
|
for _, d := range diags {
|
||||||
|
m := d.(map[string]interface{})["message"].(string)
|
||||||
|
if strings.Contains(m, "INOUT Signal 'A'") {
|
||||||
|
foundOrdering = true
|
||||||
|
t.Log("Found Ordering error")
|
||||||
|
}
|
||||||
|
if strings.Contains(m, "Unresolved variable reference: '$Value'") {
|
||||||
|
foundVariable = true
|
||||||
|
t.Log("Found Variable error")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if foundOrdering && foundVariable {
|
||||||
|
return // Success
|
||||||
|
}
|
||||||
|
}
|
||||||
|
case <-timeout:
|
||||||
|
t.Fatal("Timeout waiting for diagnostics")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
161
test/lsp_diagnostics_app_test.go
Normal file
161
test/lsp_diagnostics_app_test.go
Normal file
@@ -0,0 +1,161 @@
|
|||||||
|
package integration
|
||||||
|
|
||||||
|
import (
|
||||||
|
"bytes"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/index"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/lsp"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/schema"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestLSPDiagnosticsAppTest(t *testing.T) {
|
||||||
|
// Setup LSP environment
|
||||||
|
lsp.Tree = index.NewProjectTree()
|
||||||
|
lsp.Documents = make(map[string]string)
|
||||||
|
lsp.GlobalSchema = schema.LoadFullSchema(".") // Use default schema
|
||||||
|
|
||||||
|
// Capture output
|
||||||
|
var buf bytes.Buffer
|
||||||
|
lsp.Output = &buf
|
||||||
|
|
||||||
|
// Content from examples/app_test.marte (implicit signals, unresolved var, ordering error)
|
||||||
|
content := `+App = {
|
||||||
|
Class = RealTimeApplication
|
||||||
|
+Data = {
|
||||||
|
Class = ReferenceContainer
|
||||||
|
DefaultDataSource = DDB
|
||||||
|
+DDB = {
|
||||||
|
Class = GAMDataSource
|
||||||
|
}
|
||||||
|
+TimingDataSource = {
|
||||||
|
Class = TimingDataSource
|
||||||
|
}
|
||||||
|
}
|
||||||
|
+Functions = {
|
||||||
|
Class = ReferenceContainer
|
||||||
|
+FnA = {
|
||||||
|
Class = IOGAM
|
||||||
|
InputSignals = {
|
||||||
|
A = {
|
||||||
|
DataSource = DDB
|
||||||
|
Type = uint32
|
||||||
|
Value = $Value
|
||||||
|
}
|
||||||
|
}
|
||||||
|
OutputSignals = {
|
||||||
|
B = {
|
||||||
|
DataSource = DDB
|
||||||
|
Type = uint32
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
+States = {
|
||||||
|
Class = ReferenceContainer
|
||||||
|
+State = {
|
||||||
|
Class = RealTimeState
|
||||||
|
Threads = {
|
||||||
|
+Th1 = {
|
||||||
|
Class = RealTimeThread
|
||||||
|
Functions = { FnA }
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
+Scheduler = {
|
||||||
|
Class = GAMScheduler
|
||||||
|
TimingDataSource = TimingDataSource
|
||||||
|
}
|
||||||
|
}
|
||||||
|
`
|
||||||
|
uri := "file://app_test.marte"
|
||||||
|
|
||||||
|
// Simulate DidOpen
|
||||||
|
lsp.HandleDidOpen(lsp.DidOpenTextDocumentParams{
|
||||||
|
TextDocument: lsp.TextDocumentItem{
|
||||||
|
URI: uri,
|
||||||
|
Text: content,
|
||||||
|
},
|
||||||
|
})
|
||||||
|
|
||||||
|
output := buf.String()
|
||||||
|
|
||||||
|
// Verify Diagnostics are published
|
||||||
|
if !strings.Contains(output, "textDocument/publishDiagnostics") {
|
||||||
|
t.Fatal("LSP did not publish diagnostics")
|
||||||
|
}
|
||||||
|
|
||||||
|
// 1. Check Unresolved Variable Error ($Value)
|
||||||
|
if !strings.Contains(output, "Unresolved variable reference: '$Value'") {
|
||||||
|
t.Error("Missing diagnostic for unresolved variable '$Value'")
|
||||||
|
}
|
||||||
|
|
||||||
|
// 2. Check INOUT Ordering Error (Signal A consumed but not produced)
|
||||||
|
// Message format: INOUT Signal 'A' (DS '+DDB') is consumed by GAM '+FnA' ... before being produced ...
|
||||||
|
if !strings.Contains(output, "INOUT Signal 'A'") || !strings.Contains(output, "before being produced") {
|
||||||
|
t.Error("Missing diagnostic for INOUT ordering error (Signal A)")
|
||||||
|
}
|
||||||
|
|
||||||
|
// 3. Check INOUT Unused Warning (Signal B produced but not consumed)
|
||||||
|
// Message format: INOUT Signal 'B' ... produced ... but never consumed ...
|
||||||
|
if !strings.Contains(output, "INOUT Signal 'B'") || !strings.Contains(output, "never consumed") {
|
||||||
|
t.Error("Missing diagnostic for unused INOUT signal (Signal B)")
|
||||||
|
}
|
||||||
|
|
||||||
|
// 4. Check Implicit Signal Warnings (A and B)
|
||||||
|
if !strings.Contains(output, "Implicitly Defined Signal: 'A'") {
|
||||||
|
t.Error("Missing diagnostic for implicit signal 'A'")
|
||||||
|
}
|
||||||
|
if !strings.Contains(output, "Implicitly Defined Signal: 'B'") {
|
||||||
|
t.Error("Missing diagnostic for implicit signal 'B'")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check Unused GAM Warning (FnA is used in Th1, so should NOT be unused)
|
||||||
|
// Wait, is FnA used?
|
||||||
|
// Functions = { FnA }.
|
||||||
|
// resolveScopedName should find it?
|
||||||
|
// In previous analysis, FnA inside Functions container might be hard to find from State?
|
||||||
|
// But TestLSPAppTestRepro passed?
|
||||||
|
// If FindNode finds it (Validator uses FindNode), then it is referenced.
|
||||||
|
// CheckUnused uses `v.Tree.References`.
|
||||||
|
// `ResolveReferences` populates references.
|
||||||
|
// `ResolveReferences` uses `resolveScopedName`.
|
||||||
|
// If `resolveScopedName` fails to find FnA from Th1 (because FnA is in Functions and not sibling/ancestor),
|
||||||
|
// Then `ref.Target` is nil.
|
||||||
|
// So `FnA` is NOT referenced in Index.
|
||||||
|
// So `CheckUnused` reports "Unused GAM".
|
||||||
|
|
||||||
|
// BUT Validator uses `resolveReference` (FindNode) to verify Functions array.
|
||||||
|
// So Validator knows it is valid.
|
||||||
|
// But `CheckUnused` relies on Index References.
|
||||||
|
|
||||||
|
// If Index doesn't resolve it, `CheckUnused` warns.
|
||||||
|
// Does output contain "Unused GAM: +FnA"?
|
||||||
|
// If so, `resolveScopedName` failed.
|
||||||
|
// Let's check output if test fails or just check existence.
|
||||||
|
if strings.Contains(output, "Unused GAM: +FnA") {
|
||||||
|
// This indicates scoping limitation or intended behavior if path is not full.
|
||||||
|
// "Ref = FnA" vs "Ref = Functions.FnA".
|
||||||
|
// MARTe scoping usually allows global search?
|
||||||
|
// I added fallback to Root search in resolveScopedName.
|
||||||
|
// FnA is child of Functions. Functions is child of App.
|
||||||
|
// Root children: App.
|
||||||
|
// App children: Functions.
|
||||||
|
// Functions children: FnA.
|
||||||
|
// Fallback checks `pt.Root.Children[name]`.
|
||||||
|
// Name is "FnA".
|
||||||
|
// Root children has "App". No "FnA".
|
||||||
|
// So fallback fails.
|
||||||
|
// So Index fails to resolve "FnA".
|
||||||
|
// So "Unused GAM" warning IS expected given current Index logic.
|
||||||
|
// I will NOT assert it is missing, unless I fix Index to search deep global (FindNode) as fallback?
|
||||||
|
// Validator uses FindNode (Deep).
|
||||||
|
// Index uses Scoped + Root Top Level.
|
||||||
|
// If I want Index to match Validator, I should use FindNode as final fallback?
|
||||||
|
// But that defeats scoping strictness.
|
||||||
|
// Ideally `app_test.marte` should use `Functions.FnA` or `App.Functions.FnA`.
|
||||||
|
// But for this test, I just check the requested diagnostics.
|
||||||
|
}
|
||||||
|
}
|
||||||
58
test/operators_test.go
Normal file
58
test/operators_test.go
Normal file
@@ -0,0 +1,58 @@
|
|||||||
|
package integration
|
||||||
|
|
||||||
|
import (
|
||||||
|
"os"
|
||||||
|
"strings"
|
||||||
|
"testing"
|
||||||
|
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/builder"
|
||||||
|
"github.com/marte-community/marte-dev-tools/internal/parser"
|
||||||
|
)
|
||||||
|
|
||||||
|
func TestOperators(t *testing.T) {
|
||||||
|
content := `
|
||||||
|
#var A: int = 10
|
||||||
|
#var B: int = 20
|
||||||
|
#var S1: string = "Hello"
|
||||||
|
#var S2: string = "World"
|
||||||
|
|
||||||
|
+Obj = {
|
||||||
|
Math = $A + $B
|
||||||
|
Precedence = $A + $B * 2
|
||||||
|
Concat = $S1 .. " " .. $S2
|
||||||
|
}
|
||||||
|
`
|
||||||
|
// Check Parser
|
||||||
|
p := parser.NewParser(content)
|
||||||
|
_, err := p.Parse()
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Parse failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check Builder Output
|
||||||
|
f, _ := os.CreateTemp("", "ops.marte")
|
||||||
|
f.WriteString(content)
|
||||||
|
f.Close()
|
||||||
|
defer os.Remove(f.Name())
|
||||||
|
|
||||||
|
b := builder.NewBuilder([]string{f.Name()}, nil)
|
||||||
|
|
||||||
|
outF, _ := os.CreateTemp("", "out.marte")
|
||||||
|
defer os.Remove(outF.Name())
|
||||||
|
b.Build(outF)
|
||||||
|
outF.Close()
|
||||||
|
|
||||||
|
outContent, _ := os.ReadFile(outF.Name())
|
||||||
|
outStr := string(outContent)
|
||||||
|
|
||||||
|
if !strings.Contains(outStr, "Math = 30") {
|
||||||
|
t.Errorf("Math failed. Got:\n%s", outStr)
|
||||||
|
}
|
||||||
|
// 10 + 20 * 2 = 50
|
||||||
|
if !strings.Contains(outStr, "Precedence = 50") {
|
||||||
|
t.Errorf("Precedence failed. Got:\n%s", outStr)
|
||||||
|
}
|
||||||
|
if !strings.Contains(outStr, "Concat = \"Hello World\"") {
|
||||||
|
t.Errorf("Concat failed. Got:\n%s", outStr)
|
||||||
|
}
|
||||||
|
}
|
||||||
Reference in New Issue
Block a user