| Commit message | Author | Age | Files | Lines |
1549: Show type lenses for the resolved let bindings r=matklad a=SomeoneToIgnore
Types that are fully unresolved are not displayed:
<img width="279" alt="image" src="https://user-images.githubusercontent.com/2690773/61518122-8e4ba980-aa11-11e9-9249-6d9f9b202e6a.png">
A few concerns that I have about the current implementation:
* I've adjusted the `file_structure` API method to return information about `let` bindings.
Although it works fine, I have a feeling that adding a new API method would be the better approach.
But that requires some prior discussion, so I've gone for the easy route with an MVP.
It would be nice to hear your suggestions.
* There's a hardcoded `{undersolved}` check that I was forced to use, since the method that resolves types returns a `String` (see the sketch below).
Is there a better-typed API I can use? That would help, for instance, with adding an action to the type lenses that lets us navigate to the type.
Co-authored-by: Kirill Bulatov <[email protected]>
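As a rough illustration of the second concern, here is a minimal sketch, assuming a hypothetical lens type and placeholder string (neither is the actual rust-analyzer API), of filtering out unresolved bindings by matching on the rendered type text:

```rust
use std::ops::Range;

/// Hypothetical lens data for a `let` binding; names are illustrative only.
struct TypeLens {
    /// Text range of the binding pattern.
    range: Range<usize>,
    /// Rendered type, e.g. "Vec<String>".
    label: String,
}

/// Assumed placeholder that the type renderer emits for unresolved types;
/// matching on it is the "hardcoded check" mentioned above.
const UNRESOLVED_PLACEHOLDER: &str = "{unknown}";

/// Build lenses, skipping bindings whose type is fully unresolved.
fn type_lenses(bindings: Vec<(Range<usize>, String)>) -> Vec<TypeLens> {
    bindings
        .into_iter()
        .filter(|(_, ty)| !ty.contains(UNRESOLVED_PLACEHOLDER))
        .map(|(range, ty)| TypeLens { range, label: ty })
        .collect()
}
```

A better-typed API would let this filter check an enum variant instead of a substring, which is exactly the concern raised above.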
This appears to have been introduced ages ago in
https://github.com/rust-analyzer/rust-analyzer/commit/be742a587704f27f4e503c50f549aa9ec1527fcc
but the code that used it has since been removed.
As it stands, the registration is problematic if multiple instances of the
rust-analyzer LSP server are launched during the same VS Code session, because
VS Code complains about multiple LSP servers trying to register the
same command.
Most LSP servers work around this by parameterizing the command name with the
process id. For example, here is where `rls` does this:
https://github.com/rust-lang/rls/blob/ff0b9057c8f62bc4f8113d741e96c9587ef1a817/rls/src/server/mod.rs#L413-L421
However, `apply_code_action` does not seem to be used, so it seems better
to delete it than to parameterize it.
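For illustration, a minimal sketch of that workaround, assuming a hypothetical base command name (the actual identifier a server registers may differ):

```rust
/// Append the process id to a command name so that each server instance
/// registers a unique command and VS Code does not see duplicates.
fn instance_command(base: &str) -> String {
    format!("{}.{}", base, std::process::id())
}

fn main() {
    // Prints something like "rust-analyzer.applySourceChange.12345".
    println!("{}", instance_command("rust-analyzer.applySourceChange"));
}
```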
closes #1474
This wasn't the right decision in the first place: the feature flag was
broken in the last rustfmt release, and syntax highlighting of imports
is more important anyway.
My workflow in Visual Studio Code + Rust Analyzer has become:
1. Make a change to Rust source code using all the analysis magic
2. Save the file to trigger `cargo watch`. I have format on save enabled
for all file types, so this also runs `rustfmt`
3. Fix any diagnostics that `cargo watch` finds
Unfortunately, if the Rust source has any syntax errors, the act of saving
pops up a scary "command has failed" message and switches to the
"Output" tab to show the `rustfmt` error and exit code.
I did a quick survey of what other Language Servers do in this case.
Both the JSON and TypeScript servers will swallow the error and return
success. This is consistent with how I remember my workflow in those
languages. The syntax error will show up as a diagnostic so it should
be clear why the file isn't formatting.
I checked the `rustfmt` source code, and while it does distinguish "parse
errors" from "operational errors" internally, they both result in an exit
status of 1. However, more catastrophic errors (missing `rustfmt`,
SIGSEGV, etc.) return exit codes of 127 or higher, which we can
distinguish from a normal failure.
This changes our handler to log an info message and feign success if
`rustfmt` exits with status 1.
Another option I considered was only swallowing the error if the
formatting request came from format-on-save. However, the Language
Server Protocol doesn't seem to distinguish those cases.
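A minimal sketch of the exit-status handling described above, assuming a free-standing helper rather than the actual handler (the function shape and logging are illustrative):

```rust
use std::io::Write;
use std::process::{Command, Stdio};

/// Format `text` with rustfmt. Exit status 1 (parse or other "normal"
/// rustfmt errors) is logged and reported as "no edits"; anything else
/// non-zero is surfaced as a real failure.
fn format_text(text: &str) -> Result<Option<String>, String> {
    let mut child = Command::new("rustfmt")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .stderr(Stdio::piped())
        .spawn()
        .map_err(|e| format!("failed to spawn rustfmt: {}", e))?;

    // Feed the source text and close stdin so rustfmt can finish.
    child
        .stdin
        .take()
        .expect("stdin was piped")
        .write_all(text.as_bytes())
        .map_err(|e| e.to_string())?;

    let output = child.wait_with_output().map_err(|e| e.to_string())?;
    match output.status.code() {
        Some(0) => Ok(Some(String::from_utf8_lossy(&output.stdout).into_owned())),
        // The file didn't parse (or another "normal" error): feign success.
        Some(1) => {
            eprintln!("rustfmt exited with status 1, skipping formatting");
            Ok(None)
        }
        // Catastrophic failures (the 127+ codes mentioned above, crashes)
        // remain errors.
        _ => Err(format!("rustfmt failed: {:?}", output.status)),
    }
}
```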
The issue was Windows-specific -- cancellation caused collection of
backtraces at some point, and that was slow on Windows.
The proper fix here is to make sure that we don't collect backtraces
unnecessarily (which we currently do, due to `failure`), but, as a
temporary fix, let's just not force their collection in the first
place!
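The change itself isn't spelled out above, but the general pattern is to detect the cancellation case by downcasting the concrete error type directly, instead of routing it through machinery that eagerly captures a backtrace. A minimal sketch using std errors (the `Canceled` type and `is_canceled` helper are hypothetical, not the rust-analyzer code):

```rust
use std::error::Error;
use std::fmt;

/// Hypothetical marker error used to signal a cancelled request.
#[derive(Debug)]
struct Canceled;

impl fmt::Display for Canceled {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "request canceled")
    }
}

impl Error for Canceled {}

/// Cheap cancellation check: a downcast neither formats the error nor
/// captures a backtrace, which is the expensive part on Windows.
fn is_canceled(err: &(dyn Error + 'static)) -> bool {
    err.downcast_ref::<Canceled>().is_some()
}

fn main() {
    let err: Box<dyn Error> = Box::new(Canceled);
    assert!(is_canceled(err.as_ref()));
}
```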