Compare commits

4 Commits: `ef7729475a` ... `77fe3e9cac`
| Author | SHA1 | Date |
|---|---|---|
| | 77fe3e9cac | |
| | 0ee44c0a27 | |
| | d450d358b4 | |
| | 2cdcfe2812 | |
LICENSE (new file, 21 lines)
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 MARTe Community

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
README.md (new file, 96 lines)
@@ -0,0 +1,96 @@
# MARTe Development Tools (mdt)

`mdt` is a comprehensive toolkit for developing, validating, and building configurations for the MARTe real-time framework. It provides a CLI and a Language Server Protocol (LSP) server to enhance the development experience.

## Features

- **LSP Server**: Real-time syntax checking, validation, autocomplete, hover documentation, and navigation (Go to Definition/References).
- **Builder**: Merges multiple configuration files into a single, ordered output file.
- **Formatter**: Standardizes configuration file formatting.
- **Validator**: Advanced semantic validation using [CUE](https://cuelang.org/) schemas, ensuring type safety and structural correctness.
## Installation

### From Source

Requirements: Go 1.21+

```bash
go install github.com/marte-community/marte-dev-tools/cmd/mdt@latest
```

## Usage

### CLI Commands

- **Check**: Run validation on a file or project.

  ```bash
  mdt check path/to/project
  ```

- **Build**: Merge project files into a single output.

  ```bash
  mdt build -o output.marte main.marte
  ```

- **Format**: Format configuration files.

  ```bash
  mdt fmt path/to/file.marte
  ```

- **LSP**: Start the language server (used by editor plugins).

  ```bash
  mdt lsp
  ```

### Editor Integration

`mdt lsp` implements the Language Server Protocol. You can use it with any LSP-compatible editor (VS Code, Neovim, Emacs, etc.).

## MARTe Configuration

The tools support the MARTe configuration format with extended features:

- **Objects**: `+Node = { Class = ... }`
- **Signals**: `Signal = { Type = ... }`
- **Namespaces**: `#package PROJECT.NODE` for organizing multi-file projects.
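For orientation, the three constructs combine in a configuration along these lines (`LinuxTimer` and `IOGAM` are standard MARTe class names, but this particular layout is an invented illustration, not a snippet from the repository):

```
#package DEMO.CONTROL

+Timer = {
    Class = LinuxTimer
}

+Sensors = {
    Class = IOGAM
    InputSignals = {
        Counter = {
            DataSource = Timer
            Type = uint32
        }
    }
}
```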

### Validation & Schema

Validation is fully schema-driven using CUE.

- **Built-in Schema**: Covers standard MARTe classes (`StateMachine`, `GAM`, `DataSource`, `RealTimeApplication`, etc.).
- **Custom Schema**: Add a `.marte_schema.cue` file to your project root to extend or override definitions.

**Example `.marte_schema.cue`:**

```cue
package schema

#Classes: {
	MyCustomGAM: {
		Param1: int
		Param2?: string
		...
	}
}
```

### Pragmas (Suppressing Warnings)

Use comments starting with `//!` to control validation behavior:

- `//!unused: Reason` - Suppress "Unused GAM" or "Unused Signal" warnings.
- `//!implicit: Reason` - Suppress "Implicitly Defined Signal" warnings.
- `//!cast(DefinedType, UsageType)` - Allow type mismatch between definition and usage (e.g. `//!cast(uint32, int32)`).
- `//!allow(unused)` - Global suppression for the file.
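A pragma might be used like this (the object is invented, and the placement, on the line directly above the definition it annotates, is an assumption about how the tool binds pragmas to nodes):

```
//!unused: Kept for the commissioning campaign
+SpareGAM = {
    Class = IOGAM
}
```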

## Development

### Building

```bash
go build ./cmd/mdt
```

### Running Tests

```bash
go test ./...
```

## License

MIT
```diff
@@ -311,13 +311,18 @@ func handleDidOpen(params DidOpenTextDocumentParams) {
 	documents[params.TextDocument.URI] = params.TextDocument.Text
 	p := parser.NewParser(params.TextDocument.Text)
 	config, err := p.Parse()
 
 	if err != nil {
 		publishParserError(params.TextDocument.URI, err)
-		return
+	} else {
+		publishParserError(params.TextDocument.URI, nil)
 	}
 
-	tree.AddFile(path, config)
-	tree.ResolveReferences()
-	runValidation(params.TextDocument.URI)
+	if config != nil {
+		tree.AddFile(path, config)
+		tree.ResolveReferences()
+		runValidation(params.TextDocument.URI)
+	}
 }
 
 func handleDidChange(params DidChangeTextDocumentParams) {
```
```diff
@@ -329,13 +334,18 @@ func handleDidChange(params DidChangeTextDocumentParams) {
 	path := uriToPath(params.TextDocument.URI)
 	p := parser.NewParser(text)
 	config, err := p.Parse()
 
 	if err != nil {
 		publishParserError(params.TextDocument.URI, err)
-		return
+	} else {
+		publishParserError(params.TextDocument.URI, nil)
 	}
 
-	tree.AddFile(path, config)
-	tree.ResolveReferences()
-	runValidation(params.TextDocument.URI)
+	if config != nil {
+		tree.AddFile(path, config)
+		tree.ResolveReferences()
+		runValidation(params.TextDocument.URI)
+	}
 }
 
 func handleFormatting(params DocumentFormattingParams) []TextEdit {
```
```diff
@@ -426,6 +436,19 @@ func runValidation(uri string) {
 }
 
 func publishParserError(uri string, err error) {
+	if err == nil {
+		notification := JsonRpcMessage{
+			Jsonrpc: "2.0",
+			Method:  "textDocument/publishDiagnostics",
+			Params: mustMarshal(PublishDiagnosticsParams{
+				URI:         uri,
+				Diagnostics: []LSPDiagnostic{},
+			}),
+		}
+		send(notification)
+		return
+	}
+
 	var line, col int
 	var msg string
 	// Try parsing "line:col: message"
```
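The behavioral point of this hunk: a clean parse must actively publish an empty diagnostics list, because LSP clients only clear stale errors for a URI when they receive a fresh (possibly empty) `textDocument/publishDiagnostics` notification. A minimal standalone sketch of that payload (these types are illustrative stand-ins, not the server's actual `JsonRpcMessage`/`PublishDiagnosticsParams`):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Illustrative type only; the real definitions are not shown in this diff.
type diagParams struct {
	URI         string     `json:"uri"`
	Diagnostics []struct{} `json:"diagnostics"`
}

// clearNotification builds the publishDiagnostics payload that tells an
// LSP client to drop every diagnostic it currently shows for uri. The
// empty, non-nil slice is essential: it must encode as [] rather than
// null for clients to clear their error list.
func clearNotification(uri string) string {
	msg := map[string]interface{}{
		"jsonrpc": "2.0",
		"method":  "textDocument/publishDiagnostics",
		"params":  diagParams{URI: uri, Diagnostics: []struct{}{}},
	}
	b, err := json.Marshal(msg)
	if err != nil {
		panic(err)
	}
	return string(b)
}

func main() {
	fmt.Println(clearNotification("file:///demo.marte"))
}
```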
```diff
@@ -11,6 +11,7 @@ type Parser struct {
 	buf      []Token
 	comments []Comment
 	pragmas  []Pragma
+	errors   []error
 }
 
 func NewParser(input string) *Parser {
```
```diff
@@ -19,6 +20,10 @@ func NewParser(input string) *Parser {
 	}
 }
 
+func (p *Parser) addError(pos Position, msg string) {
+	p.errors = append(p.errors, fmt.Errorf("%d:%d: %s", pos.Line, pos.Column, msg))
+}
+
 func (p *Parser) next() Token {
 	if len(p.buf) > 0 {
 		t := p.buf[0]
```
```diff
@@ -71,72 +76,82 @@ func (p *Parser) Parse() (*Configuration, error) {
 			continue
 		}
 
-		def, err := p.parseDefinition()
-		if err != nil {
-			return nil, err
+		def, ok := p.parseDefinition()
+		if ok {
+			config.Definitions = append(config.Definitions, def)
+		} else {
+			// Synchronization: skip token if not consumed to make progress
+			if p.peek() == tok {
+				p.next()
+			}
 		}
-		config.Definitions = append(config.Definitions, def)
 	}
 	config.Comments = p.comments
 	config.Pragmas = p.pragmas
-	return config, nil
+
+	var err error
+	if len(p.errors) > 0 {
+		err = p.errors[0]
+	}
+	return config, err
 }
 
-func (p *Parser) parseDefinition() (Definition, error) {
+func (p *Parser) parseDefinition() (Definition, bool) {
 	tok := p.next()
 	switch tok.Type {
 	case TokenIdentifier:
 		// Could be Field = Value OR Node = { ... }
 		name := tok.Value
-		if p.next().Type != TokenEqual {
-			return nil, fmt.Errorf("%d:%d: expected =", tok.Position.Line, tok.Position.Column)
+		if p.peek().Type != TokenEqual {
+			p.addError(tok.Position, "expected =")
+			return nil, false
 		}
+		p.next() // Consume =
 
 		// Disambiguate based on RHS
 		nextTok := p.peek()
 		if nextTok.Type == TokenLBrace {
 			// Check if it looks like a Subnode (contains definitions) or Array (contains values)
 			if p.isSubnodeLookahead() {
-				sub, err := p.parseSubnode()
-				if err != nil {
-					return nil, err
+				sub, ok := p.parseSubnode()
+				if !ok {
+					return nil, false
 				}
 				return &ObjectNode{
 					Position: tok.Position,
 					Name:     name,
 					Subnode:  sub,
-				}, nil
+				}, true
 			}
 		}
 
 		// Default to Field
-		val, err := p.parseValue()
-		if err != nil {
-			return nil, err
+		val, ok := p.parseValue()
+		if !ok {
+			return nil, false
 		}
 		return &Field{
 			Position: tok.Position,
 			Name:     name,
 			Value:    val,
-		}, nil
+		}, true
 
 	case TokenObjectIdentifier:
 		// node = subnode
 		name := tok.Value
-		if p.next().Type != TokenEqual {
-			return nil, fmt.Errorf("%d:%d: expected =", tok.Position.Line, tok.Position.Column)
+		if p.peek().Type != TokenEqual {
+			p.addError(tok.Position, "expected =")
+			return nil, false
 		}
-		sub, err := p.parseSubnode()
-		if err != nil {
-			return nil, err
+		p.next() // Consume =
+
+		sub, ok := p.parseSubnode()
+		if !ok {
+			return nil, false
 		}
 		return &ObjectNode{
 			Position: tok.Position,
 			Name:     name,
 			Subnode:  sub,
-		}, nil
+		}, true
 	default:
-		return nil, fmt.Errorf("%d:%d: unexpected token %v", tok.Position.Line, tok.Position.Column, tok.Value)
+		p.addError(tok.Position, fmt.Sprintf("unexpected token %v", tok.Value))
+		return nil, false
 	}
 }
```
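The refactor above replaces fail-fast `(T, error)` returns with `(T, bool)` plus an accumulated `p.errors` slice, and adds a synchronization step that skips one token whenever a production consumed nothing, so the loop always makes progress and later definitions still parse. The same pattern in a self-contained toy form (none of these types belong to the real parser):

```go
package main

import "fmt"

// toyParser mimics the diff's recovery strategy: record errors instead
// of aborting, and force progress past any token a production rejects.
type toyParser struct {
	toks   []string
	pos    int
	errors []error
}

func (p *toyParser) peek() string {
	if p.pos >= len(p.toks) {
		return ""
	}
	return p.toks[p.pos]
}

// parseDef accepts any token except the bad marker "?"; a rejected
// token is recorded as an error and reported via ok=false.
func (p *toyParser) parseDef() (string, bool) {
	t := p.peek()
	if t != "" && t != "?" {
		p.pos++
		return t, true
	}
	p.errors = append(p.errors, fmt.Errorf("pos %d: unexpected token %q", p.pos, t))
	return "", false
}

func (p *toyParser) parse() ([]string, error) {
	var defs []string
	for p.pos < len(p.toks) {
		start := p.pos
		if def, ok := p.parseDef(); ok {
			defs = append(defs, def)
		} else if p.pos == start {
			p.pos++ // synchronization: skip the bad token to make progress
		}
	}
	// Mirror the diff: return the partial result plus the first error.
	var err error
	if len(p.errors) > 0 {
		err = p.errors[0]
	}
	return defs, err
}

func main() {
	p := &toyParser{toks: []string{"a=1", "?", "b=2"}}
	defs, err := p.parse()
	fmt.Println(defs, err) // definitions after the bad token survive
}
```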
```diff
@@ -176,10 +191,11 @@ func (p *Parser) isSubnodeLookahead() bool {
 	return false
 }
 
-func (p *Parser) parseSubnode() (Subnode, error) {
+func (p *Parser) parseSubnode() (Subnode, bool) {
 	tok := p.next()
 	if tok.Type != TokenLBrace {
-		return Subnode{}, fmt.Errorf("%d:%d: expected {", tok.Position.Line, tok.Position.Column)
+		p.addError(tok.Position, "expected {")
+		return Subnode{}, false
 	}
 	sub := Subnode{Position: tok.Position}
 	for {
```
```diff
@@ -190,18 +206,22 @@ func (p *Parser) parseSubnode() (Subnode, bool) {
 			break
 		}
 		if t.Type == TokenEOF {
-			return sub, fmt.Errorf("%d:%d: unexpected EOF, expected }", t.Position.Line, t.Position.Column)
+			p.addError(t.Position, "unexpected EOF, expected }")
+			return sub, false
 		}
-		def, err := p.parseDefinition()
-		if err != nil {
-			return sub, err
+		def, ok := p.parseDefinition()
+		if ok {
+			sub.Definitions = append(sub.Definitions, def)
+		} else {
+			if p.peek() == t {
+				p.next()
+			}
 		}
-		sub.Definitions = append(sub.Definitions, def)
 	}
-	return sub, nil
+	return sub, true
 }
 
-func (p *Parser) parseValue() (Value, error) {
+func (p *Parser) parseValue() (Value, bool) {
 	tok := p.next()
 	switch tok.Type {
 	case TokenString:
```
```diff
@@ -209,24 +229,21 @@ func (p *Parser) parseValue() (Value, bool) {
 			Position: tok.Position,
 			Value:    strings.Trim(tok.Value, "\""),
 			Quoted:   true,
-		}, nil
+		}, true
 	case TokenNumber:
 		// Simplistic handling
 		if strings.Contains(tok.Value, ".") || strings.Contains(tok.Value, "e") {
 			f, _ := strconv.ParseFloat(tok.Value, 64)
-			return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, nil
+			return &FloatValue{Position: tok.Position, Value: f, Raw: tok.Value}, true
 		}
 		i, _ := strconv.ParseInt(tok.Value, 0, 64)
-		return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, nil
+		return &IntValue{Position: tok.Position, Value: i, Raw: tok.Value}, true
 	case TokenBool:
 		return &BoolValue{Position: tok.Position, Value: tok.Value == "true"},
-			nil
+			true
 	case TokenIdentifier:
 		// reference?
-		return &ReferenceValue{Position: tok.Position, Value: tok.Value}, nil
+		return &ReferenceValue{Position: tok.Position, Value: tok.Value}, true
 	case TokenLBrace:
 		// array
 		arr := &ArrayValue{Position: tok.Position}
 		for {
 			t := p.peek()
```
```diff
@@ -239,14 +256,15 @@ func (p *Parser) parseValue() (Value, bool) {
 				p.next()
 				continue
 			}
-			val, err := p.parseValue()
-			if err != nil {
-				return nil, err
+			val, ok := p.parseValue()
+			if !ok {
+				return nil, false
 			}
 			arr.Elements = append(arr.Elements, val)
 		}
-		return arr, nil
+		return arr, true
 	default:
-		return nil, fmt.Errorf("%d:%d: unexpected value token %v", tok.Position.Line, tok.Position.Column, tok.Value)
+		p.addError(tok.Position, fmt.Sprintf("unexpected value token %v", tok.Value))
+		return nil, false
 	}
 }
```
```diff
@@ -29,7 +29,12 @@ The LSP server should provide the following capabilities:
 - **Go to Definition**: Jump to the definition of a reference, supporting navigation across any file in the current project.
 - **Go to References**: Find usages of a node or field, supporting navigation across any file in the current project.
 - **Code Completion**: Autocomplete fields, values, and references.
-- **Code Snippets**: Provide snippets for common patterns.
+  - **Context-Aware**: Suggestions depend on the cursor position (e.g., inside an object, assigning a value).
+  - **Schema-Driven**: Field suggestions are derived from the CUE schema for the current object's Class, indicating mandatory vs. optional fields.
+  - **Reference Suggestions**:
+    - `DataSource` fields suggest available DataSource objects.
+    - `Functions` (in Threads) suggest available GAM objects.
+- **Code Snippets**: Provide snippets for common patterns (e.g., `+Object = { ... }`).
 - **Formatting**: Format the document using the same rules and engine as the `fmt` command.
 
 ## Build System & File Structure
```
```diff
@@ -47,9 +52,9 @@
 - **Namespace Consistency**: The build tool must verify that all input files belong to the same project namespace (the first segment of the `#package` URI). If multiple project namespaces are detected, the build must fail with an error.
 - **Target**: The build output is written to a single target file (e.g., provided via CLI or API).
 - **Multi-File Definitions**: Nodes and objects can be defined across multiple files. The build tool, validator, and LSP must merge these definitions (including all fields and sub-nodes) from the entire project to create a unified view before processing or validating.
-- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project.
-- **Merging Order**: For objects defined across multiple files, the **first file** to be considered is the one containing the `Class` field definition.
-- **Field Order**: Within a single file, the relative order of defined fields must be maintained.
+- **Global References**: References to nodes, signals, or objects can point to definitions located in any file within the project. Support for dot-separated paths (e.g., `Node.SubNode`) is required.
+- **Merging Order**: For objects defined across multiple files, definitions are merged. The build tool must preserve the relative order of fields and sub-nodes as they appear in the source files, interleaving them correctly in the final output.
+- **Field Order**: Within a single file (and across merged files), the relative order of defined fields must be maintained in the output.
 - The LSP indexes only files belonging to the same project/namespace scope.
 - **Output**: The output format is the same as the input configuration but without the `#package` macro.
```
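To illustrate the multi-file merge this hunk specifies (file names, object, and fields are hypothetical), two files can contribute to the same object; the build merges them into a single `+Controller` node, preserving source order, and a dot-separated reference such as `Controller.MaxOutput` can resolve from any file in the project:

```
// controller.marte
#package DEMO.MOTOR
+Controller = {
    Class = PIDGAM
    Kp = 1.5
}

// controller_limits.marte
#package DEMO.MOTOR
+Controller = {
    MaxOutput = 10.0
}
```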
```diff
@@ -160,13 +165,13 @@ The tool must build an index of the configuration to support LSP features and validation:
 - **Field Order**: Verification that specific fields appear in a prescribed order when required by the class definition.
 - **Conditional Fields**: Validation of fields whose presence or value depends on the values of other fields within the same node or context.
 - **Schema Definition**:
-  - Class validation rules must be defined in a separate schema file.
+  - Class validation rules must be defined in a separate schema file using the **CUE** language.
   - **Project-Specific Classes**: Developers can define their own project-specific classes and corresponding validation rules, expanding the validation capabilities for their specific needs.
 - **Schema Loading**:
-  - **Default Schema**: The tool should look for a default schema file `marte_schema.json` in standard system locations:
-    - `/usr/share/mdt/marte_schema.json`
-    - `$HOME/.local/share/mdt/marte_schema.json`
-  - **Project Schema**: If a file named `.marte_schema.json` exists in the project root, it must be loaded.
+  - **Default Schema**: The tool should look for a default schema file `marte_schema.cue` in standard system locations:
+    - `/usr/share/mdt/marte_schema.cue`
+    - `$HOME/.local/share/mdt/marte_schema.cue`
+  - **Project Schema**: If a file named `.marte_schema.cue` exists in the project root, it must be loaded.
 - **Merging**: The final schema is a merge of the built-in schema, the system default schema (if found), and the project-specific schema. Rules in later sources (Project > System > Built-in) append to or override earlier ones.
 - **Duplicate Fields**:
   - **Constraint**: A field must not be defined more than once within the same object/node scope, even if those definitions are spread across different files.
```