Add support for processing language server protocol messages #10025
yegappan wants to merge 1 commit into vim:master
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master   #10025      +/-   ##
==========================================
+ Coverage   81.91%   81.93%   +0.01%
==========================================
  Files         167      167
  Lines      187325   187385      +60
  Branches    42236    42254      +18
==========================================
+ Hits       153448   153528      +80
+ Misses      21531    21507      -24
- Partials    12346    12350       +4
Force-pushed from 9b910d6 to 7b23ee7
src/channel.c (outdated)

    // We find the end once, to avoid calling strlen() many times.
    reader->js_end = reader->js_buf + STRLEN(reader->js_buf);

    // skip the HTTP header
The logic here looks a bit fragile, as it assumes a fixed order of header fields (the order is irrelevant according to RFC 7230 3.2.2) and a fixed number of them (the LSP specification only states that "at least one header is mandatory").
A better solution would be to iterate line by line (terminated by \r\n) until an empty one is found, parsing/validating each header entry that's found.
Some validation of the parsed data is also needed:
- The Content-Length must match the effective payload length according to the HTTP spec, and it can easily be used to check whether the given request has been fully received, delaying the parsing otherwise.
  > When a Content-Length is given in a message where a message-body is allowed, its field value MUST exactly match the number of OCTETs in the message-body. HTTP/1.1 user agents MUST notify the user when an invalid length is received and detected.
- The Content-Type must contain utf8 or utf-8 to make the code more robust wrt encoding changes.
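The line-by-line approach suggested here can be sketched as follows. This is a standalone illustration, not Vim's actual code; the function name skip_lsp_header() and its interface are hypothetical:

```c
#include <stddef.h>
#include <string.h>

/* Walk the buffer line by line, where each header line is terminated by
 * "\r\n", until the empty line that separates the header from the
 * payload.  Returns the offset of the first payload byte, or -1 when
 * the header is still incomplete. */
static long skip_lsp_header(const char *buf, size_t len)
{
    size_t pos = 0;

    for (;;)
    {
        const char *line = buf + pos;
        const char *eol = memchr(line, '\n', len - pos);
        size_t line_len;

        if (eol == NULL)
            return -1;          /* partial header: wait for more data */
        line_len = (size_t)(eol - line) + 1;

        /* An empty line ("\r\n") terminates the header block. */
        if (line_len == 2 && line[0] == '\r')
            return (long)(pos + line_len);

        /* A real implementation would parse and validate each header
         * field here (Content-Length, Content-Type, ...), in any order. */
        pos += line_len;
    }
}
```

Because the loop only cares about the empty terminator line, unknown header fields are skipped, which also addresses forward compatibility with future LSP header additions.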
Force-pushed from 41b4acb to 47ae5e5
Hi,

On Mon, Mar 28, 2022 at 1:48 AM LemonBoy ***@***.***> wrote:
> In src/channel.c <#10025 (comment)>:
>
> The logic here looks a bit fragile as it assumes a fixed order of header
> fields and a fixed number of them. A better solution would be to iterate
> line by line (terminated by \r\n) until an empty one is found,
> parsing/validating each header entry that's found. Some validation of the
> parsed data is also needed: the Content-Length must match the effective
> payload length, and the Content-Type must contain utf8 or utf-8.

Thanks for the feedback. I have updated the PR to process the HTTP header
correctly and check for these conditions. Let me know if any other cases
need to be handled.

Regards,
Yegappan
src/channel.c (outdated)

    // process the content length field (if present)
    if ((p - line_start > 16)
            && STRNICMP(line_start, "Content-Length: ", 16) == 0)
        payload_len = atoi((char *)line_start + 16);
This line is quite dangerous: a negative content length can and will wreak havoc.
The use of atoi is also problematic: in case of overflow you may be invoking undefined behaviour, and the error checking is quite lacking. This call can be replaced with strtol (with proper error checking; don't forget the check for ERANGE) and a check to make sure the value is not negative.
Using unsigned variables for everything that represents a length is also a good idea.
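A sketch of the suggested strtol()-based replacement. This is standalone illustrative code, not Vim's implementation; parse_content_length() is a hypothetical helper:

```c
#include <errno.h>
#include <stdlib.h>

/* Parse a Content-Length header value, rejecting overflow (ERANGE),
 * missing digits, trailing garbage and negative values.
 * Returns the parsed length, or -1 on any error. */
static long parse_content_length(const char *value)
{
    char *end;
    long n;

    errno = 0;
    n = strtol(value, &end, 10);

    if (errno == ERANGE)                /* value does not fit in a long */
        return -1;
    if (end == value)                   /* no digits at all */
        return -1;
    if (*end != '\0' && *end != '\r')   /* trailing garbage */
        return -1;
    if (n < 0)                          /* negative lengths are invalid */
        return -1;
    return n;
}
```

Unlike atoi(), strtol() defines its behaviour on overflow (it returns LONG_MAX/LONG_MIN and sets errno to ERANGE) and reports where parsing stopped, so malformed values can be rejected instead of silently misread.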
src/channel.c
Outdated
| && STRNICMP(line_start, "Content-Length: ", 16) == 0) | ||
| payload_len = atoi((char *)line_start + 16); | ||
|
|
||
| if ((line_start + 2) == p && line_start[0] == '\r' && |
Minor nit: make this consistent with the check above.

    -    if ((line_start + 2) == p && line_start[0] == '\r' &&
    +    if ((p - line_start) == 2 && line_start[0] == '\r' &&
src/channel.c (outdated)

    hdr_len = p - reader->js_buf;

    // if the entire payload is not received, wait for more data to arrive
    if (jsbuf_len < hdr_len + payload_len)
The parser should only consume payload_len bytes from the request, the remaining data shall be interpreted as the beginning of a new one.
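The suggested behavior can be sketched like this (illustrative only; consume_message() is a hypothetical helper, not part of Vim):

```c
#include <stddef.h>
#include <string.h>

/* After one complete message (header + payload) has been decoded, only
 * that many bytes are consumed; any trailing bytes are kept because
 * they are the start of the next message.
 * Returns the number of bytes remaining in the buffer. */
static size_t consume_message(char *buf, size_t buf_len,
                              size_t hdr_len, size_t payload_len)
{
    size_t msg_len = hdr_len + payload_len;
    size_t remaining = buf_len - msg_len;

    /* Shift the unconsumed tail to the front of the buffer. */
    memmove(buf, buf + msg_len, remaining);
    return remaining;
}
```

This matters because a server may write several LSP messages back to back in a single chunk, so bytes past Content-Length belong to the next header.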
src/json.c (outdated)

    garray_T    lspga;

    ga_init2(&ga, 1, 4000);
    json_encode_gap(&ga, val, 0);
If json_encode_gap() fails, an empty payload is sent; the error should be relayed to the caller.
Force-pushed from e01c494 to ea03d85
Hi,

On Tue, Mar 29, 2022 at 12:52 AM LemonBoy ***@***.***> wrote:
> In src/channel.c <#10025 (comment)>:
>
> +    // Process each line in the header till an empty line is read (header
> +    // separator).
> +    while (TRUE)
> +    {
> +        line_start = p;
> +        while (*p != NUL && *p != '\n')
> +            p++;
> +        if (*p == NUL)  // partial header
> +            return MAYBE;
> +        p++;
> +
> +        // process the content length field (if present)
> +        if ((p - line_start > 16)
> +                && STRNICMP(line_start, "Content-Length: ", 16) == 0)
> +            payload_len = atoi((char *)line_start + 16);
>
> This line is quite dangerous, a negative content length can and will
> wreak havoc. The use of atoi is also problematic: in case of overflow
> you may be invoking undefined behaviour, and the error checking is quite
> lacking. This call can be replaced with strtol (with proper error
> checking; don't forget the check for ERANGE) and a check to make sure
> the value is not negative. Using unsigned variables for everything that
> represents a length is also a good idea.

I have changed atoi() to strtol() and added the valid header length checks.

Regards,
Yegappan
Hi,

On Tue, Mar 29, 2022 at 12:55 AM LemonBoy ***@***.***> wrote:
> In src/channel.c:
>
> +            payload_len = atoi((char *)line_start + 16);
> +
> +        if ((line_start + 2) == p && line_start[0] == '\r' &&
> +                line_start[1] == '\n')
> +            // reached the empty line
> +            break;
> +    }
> +
> +    if (payload_len == -1)
> +        // Content-Length field is not present in the header
> +        return FAIL;
> +
> +    hdr_len = p - reader->js_buf;
> +
> +    // if the entire payload is not received, wait for more data to arrive
> +    if (jsbuf_len < hdr_len + payload_len)
>
> The parser should only consume payload_len bytes from the request, the
> remaining data shall be interpreted as the beginning of a new one.

The reader->js_end field is adjusted a few lines below the above line so
that only payload_len bytes are consumed from the request.

- Yegappan
Force-pushed from 05e0b0e to f0ba3ab
Was about to play around and noticed that no release has been created for the past 7 days at https://github.com/vim/vim-win32-installer/releases. https://ci.appveyor.com/project/chrisbra/vim-win32-installer/builds/43143548/job/okn2ow3fvro3kxvj
Hi,

On Tue, Apr 5, 2022 at 5:44 PM Prabir Shrestha ***@***.***> wrote:
> Was about to play around and noticed that no release has been created for
> the past 7 days at https://github.com/vim/vim-win32-installer/releases.

The latest MS-Windows Vim build (v8.2.4710) is available now. You should
be able to try out the LSP support.

Regards,
Yegappan
I'm able to download the latest version of vim with lsp support on Windows. Thanks for the quick fix. ArchLinux vim also seems to be updated to include this patch. Looking forward to seeing plugin authors use this feature. I have started adding support for a native lsp client in vim-lsp and I'm very happy with the simplicity and ease of use of the api. I have just got basic request/response working, so this feedback is a bit limited after playing with the api for several mins.

    let g:lsp_format_sync_timeout = 1000
    autocmd! BufWritePre *.rs,*.go call execute('LspDocumentFormatSync')
I figured out how to get the id, but it seems like the request and response ids are not the same. The req id seems to always be +1 of the response id.

    let req = { 'method': 'initialize', 'params': {...} }
    call ch_sendexpr(ch, req, { 'callback': function('s:cb', [req]) })
    call s:log(req) " req => { "method": "initialize", "jsonrpc": "2.0", "id": 2, "params": {} }

    function! s:cb(req, ch, res) abort
        " a:req => { "method": "initialize", "jsonrpc": "2.0", "id": 2, "params": {} }
        " a:res => { "id": 1, "result": {}, "jsonrpc": "2.0" }
        call s:log(a:req, a:res)
    endfunction
Seems like a bug in my code. The ids are generated correctly, but it would be good to document that the id will be added to the request.
Hi,

On Thu, Apr 14, 2022 at 10:34 PM Prabir Shrestha ***@***.***> wrote:
> I'm able to download the latest version of vim with lsp support on
> Windows. Thanks for the quick fix. ArchLinux vim also seems to be updated
> to include this patch. Looking forward to seeing plugin authors use this
> feature.
> I have started adding support for native lsp client in vim-lsp
> <https://github.com/prabirshrestha/vim-lsp/> and I'm very happy with the
> simplicity and ease of use of the api.

Thanks for the feedback.

> I have just got basic request/response working so this feedback is a bit
> limited while playing with the api for several mins.
>
> 1. LSP supports $/cancelRequest
> <https://microsoft.github.io/language-server-protocol/specifications/specification-current/#cancelRequest>
> which requires sending the request id. This is heavily used by vim-lsp in
> order to cancel previous requests such as completion results that could
> have changed, or highlighting of the current keyword that could have
> changed due to cursor movements. Doc explicitly says "Note that in the
> request message the 'id' field should not be specified". Is it possible
> to have an api to either set the id or get the id of the request?

Are you using the ch_evalexpr() function or the ch_sendexpr() function to
send the original request? If you are using the ch_sendexpr() function,
then you can specify an identifier in the message. Vim will not modify the
identifier. If you are using the ch_evalexpr() function, then Vim will
return only when the response message is received or the request times out.
So you cannot cancel a request sent using the ch_evalexpr() function.

> 2. Is it possible to document what resp is in the example doc: let resp
> = ch_evalexpr(ch, req, #{timeout: 100}). I also see this being used in
> the test and it checks for an empty object. I thought this was the
> request id but it always seems to be empty during my limited test.
> Probably this only makes sense for a synchronous request? It would still
> be good to document that it will always be empty for async, as it created
> a bit of confusion for me.

The ch_evalexpr() function returns the decoded response message returned
by the server for the request message. In the test_channel.vim file, you
will see a few examples for this (e.g. around line 2540). The
ch_sendexpr() function always returns an empty string. I will update the
help text to describe this.

> 3. Minor: Should the doc include the full raw request? {'jsonrpc': '2.0',
> 'id': 1, 'result': { ...}}. I see this as useful doc when writing remote
> LSP/Json RPC plugins.

Yes.

> 4. It took me a bit to figure out the callback args for ch_evalexpr but
> it seems like this is already documented at the end where it says to
> refer to |channel-callback|. One thing that wasn't clear is if there is
> a JSON RPC Error Message, would it still call the callback function?
> Clarifying that the caller needs to handle the message whether it is a
> success or error rpc message would be better here. In the future I can
> see having on_success and on_error callbacks. I don't see me using this
> in vim-lsp but do see it being used in other plugins.
> https://www.jsonrpc.org/specification#response_object

The callback function will be called for a JSON RPC error also. So it will
be called for both success and error. I will update the doc to clarify
this.

> 5. Can the sync and async methods be used on the same channel?

Yes. You can use the ch_evalexpr(), ch_sendexpr() and ch_sendraw()
functions on the same channel.

> 6. Because vim doesn't support async BufWritePre we recommend users to
> use the following code to auto format on save. While 99% of the code will
> be using async functions there will be sync methods for writing.
>
> let g:lsp_format_sync_timeout = 1000
> autocmd! BufWritePre *.rs,*.go call execute('LspDocumentFormatSync')
>
> 7. Is it possible to document the behavior of how to handle timeouts?
> When using let resp = ch_evalexpr(ch, req, #{timeout: 100}) what error
> is thrown and what error should I handle?

In the case of a timeout, an empty string will be returned. There won't be
any error messages. I will update the doc.

Thanks,
Yegappan
Hi,

One correction below.

On Fri, Apr 15, 2022 at 8:12 AM Yegappan Lakshmanan ***@***.***> wrote:
> If you are using the ch_sendexpr() function, then you can specify an
> identifier in the message. Vim will not modify the identifier.

With the ch_sendexpr() function, if you specify a 'callback' function,
then Vim will use an internally generated 'id' value and will overwrite
the caller supplied 'id'. Currently there is no support for getting the
'id' value used by the ch_sendexpr() function. I think we can return the
'id' value used if a callback function is specified.

- Yegappan
Hi,

On Thu, Apr 14, 2022 at 10:34 PM Prabir Shrestha ***@***.***> wrote:
> LSP supports $/cancelRequest
> <https://microsoft.github.io/language-server-protocol/specifications/specification-current/#cancelRequest>
> which requires sending the request id. This is heavily used by vim-lsp
> in order to cancel previous requests. [...] Doc explicitly says "Note
> that in the request message the 'id' field should not be specified". Is
> it possible to have an api to either set the id or get the id of the
> request?

Patch 8.2.4758 (3b470ae) added support for returning the message ID for a
request message sent using ch_sendexpr(). The help text is also updated
to describe how to use these functions. Let me know if you have any
additional comments.

Regards,
Yegappan
Thanks for the quick updates on docs. I'm working on a native-lsp-client branch in vim-lsp but am not able to get completion items working. There could be a potential parsing bug. Here is the link to the full channel log: https://gist.github.com/prabirshrestha/74d69fe9df0a85126727e99d09ddecbb

The language server is returning the completion items, but since the parsing fails I can never get the message. The repository I'm trying is a simple hello world rust program created via cargo new helloworld; type and see the completion items. I also see timed out messages; do we need to configure some timeout options here? In one of the logs I see partial messages.
Hi,

On Sat, Apr 16, 2022 at 2:10 PM Prabir Shrestha ***@***.***> wrote:
> Thanks for the quick updates on docs.
> I'm working on a native-lsp-client branch
> <https://github.com/prabirshrestha/vim-lsp/tree/native-lsp-client> in
> vim-lsp but not able to get completion items working. There could be a
> potential parsing bug.
> Here is the link to the full channel log:
> https://gist.github.com/prabirshrestha/74d69fe9df0a85126727e99d09ddecbb

Thanks for reporting the issue. It looks like if the HTTP header is
received separately from the JSON payload, then there is a problem in
processing the response properly. I am able to reproduce this issue with
the test program. I need to debug this more.

Regards,
Yegappan
Glad that you have a repro. Do watch out for perf issues, though this can be fixed along the way in the near future while the parsing bug is fixed. In vim-lsp we have pretty bad perf due to parsing, where we concat strings and this causes a lot of memory allocation. These threads might be relevant:
- prabirshrestha/vim-lsp#1073
- hrsh7th/vim-vital-vs@6af4d34

You might also be interested in how the vscode lang server implements it. They use a MessageBuffer and a MessageReader and directly read bytes with allocNative.
Hi Prabir,

On Sat, Apr 16, 2022 at 11:06 PM Prabir Shrestha ***@***.***> wrote:
> Glad that you have a repro. Do watch out for perf issues though this can
> be fixed along the way in the near future while the parsing bug is fixed.
> In vim-lsp we have a pretty bad perf due to parsing where we concat
> strings and this causes lot of memory allocation.

I have created PR #10215 with a fix for this. Can you try it out and let
me know if it works with your tests?

Thanks,
Yegappan
I have tried the PR and it correctly parses the LSP message. Thanks for the quick fix!

Found another bug. Doc says that the channel callback will return the decoded json rpc message as a vim dict. But from my experience it seems like it may return a string if the server writes to stderr. I did a quickfix in vim-lsp to always check the type to be a dict (prabirshrestha/vim-lsp@71efb2d). I think the better option would be to never call the callback for stderr and ask the caller to attach out_err instead.

Here is when the callback response is a string:

    Sun Apr 17 14:58:54 2022:["s:native_notification_callback","2022/04/17 14:58:54 efm-langserver: reading on stdin, writing on stdout"]

Here is when the callback response is a dict:

    Sun Apr 17 14:59:25 2022:["s:native_notification_callback",{"id":0,"jsonrpc":"2.0","method":"window/workDoneProgress/create","params":{"token":"rustAnalyzer/Fetching"}}]

    let l:jobopt = { 'in_mode': 'lsp', 'out_mode': 'lsp', 'noblock': 1, 'callback': function('s:native_notification_callback') }
    let l:job = job_start(a:opts['cmd'], l:jobopt)

    function! s:native_notification_callback(channel, response) abort
        call s:log('s:native_notification_callback', a:response)
        if type(a:response) == type({})
            " a:response is a json decoded lsp message
        else
            " a:response is a string from stderr
        endif
    endfunction
Hi,

On Sun, Apr 17, 2022 at 3:04 PM Prabir Shrestha ***@***.***> wrote:
> I have tried the PR and it correctly parses the LSP message. Thanks for
> the quick fix!

Thanks for verifying the fix.

> Found another bug. Doc says that channel callback will return decoded
> json rpc message as vim dict.
> https://github.com/yegappan/vim/blob/aea71ca1f496c89dbfd127376368bb8c7c693782/runtime/doc/channel.txt#L1415-L1422

We should call this a doc bug. Depending on the mode specified for stderr,
the callback function can be called with either a string or a dict. To
handle LSP messages on stdout and strings on stderr, it is better to
specify separate callback functions for "out_cb" and "err_cb" (as you
have described below). I will update the help to mention this.

Regards,
Yegappan
I'm unblocked now that I can use
Hi Prabir,

On Sun, Apr 17, 2022 at 3:04 PM Prabir Shrestha ***@***.***> wrote:
> I have tried the PR and it correctly parses the LSP message. Thanks for
> the quick fix!

Do we need any further changes to the LSP APIs or the doc? Are you able
to convert the vim-lsp plugin
(https://github.com/prabirshrestha/vim-lsp/tree/native-lsp-client) to use
the new APIs?

Thanks,
Yegappan
I have been using native-lsp-client in vim-lsp for several days and haven't seen any issues with it. I should be all good. I need to refactor the code on my side and add support for tcp to be on par with the master branch before I can merge it. My next step would then be to profile and see the native client's perf improvements, but this is probably a few weeks away. I usually use the typescript emitter.ts file for perf benchmarks on lsp, as it is ~6k lines of real code, in case you want to go ahead and try it out.
Add basic support for processing language server protocol messages.
The protocol is described in:
https://microsoft.github.io/language-server-protocol/specification.
The existing support in Vim for processing JSON encoded messages
doesn't work for the language server protocol. LSP messages start
with an HTTP header followed by a JSON encoded dictionary. This format
is currently not supported by the Vim channel "json" mode.
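To illustrate the wire format described above: an LSP message is a header whose Content-Length field gives the byte length of the JSON payload, followed by an empty line and the payload itself, with every header line ending in "\r\n". A minimal sketch of the framing (this helper is hypothetical, not part of the PR):

```c
#include <stdio.h>
#include <string.h>

/* Write an LSP-framed message into out: header, blank line, payload. */
static int frame_lsp_message(char *out, size_t outsize, const char *json)
{
    return snprintf(out, outsize,
            "Content-Length: %zu\r\n"
            "Content-Type: application/vscode-jsonrpc; charset=utf-8\r\n"
            "\r\n"
            "%s",
            strlen(json), json);
}
```

For a 17-byte payload such as {"jsonrpc":"2.0"} this produces a header starting with "Content-Length: 17".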
This PR adds a new "lsp" channel mode to process the LSP header. An LSP
client can use ch_sendexpr() to send a Vim expression to the LSP server;
received messages will be decoded and the callback function will be
invoked with a Vim type.
I have tested this with the Vim9 LSP client and it works. But this needs more
extensive testing.
The RPC semantic is not yet supported.
Should be OK to support this, it should make handling the messages
faster (did you measure how much?).
A detail: The header is not a "HTTP header", that would suggest many
HTTP fields would be supported. Better is "HTTP semantic". It
basically means a field name, colon and value terminated by CR NL.
I suggest that in channel_skip_lsp_http_hdr() a loop is used to skip
lines until an empty one is found. I assume a future version may add
another header field; we should ignore it, so that an older Vim works
with a newer LSP server.
--
Q: How many hardware engineers does it take to change a lightbulb?
A: None. We'll fix it in software.

/// Bram Moolenaar -- ***@***.*** -- http://www.Moolenaar.net \\\
\\\ sponsor Vim, vote for features -- http://www.Vim.org/sponsor/ ///
\\\ help me help AIDS victims -- http://ICCF-Holland.org ///
Add support for processing language server protocol encoded messages. The protocol is
described in: https://microsoft.github.io/language-server-protocol/specification.
The existing support in Vim for processing JSON encoded messages doesn't work
for the language server protocol. LSP messages start with an HTTP header followed
by a JSON encoded dictionary. This format is currently not supported by the Vim
channel "json" mode.
This PR adds a new "lsp" channel mode to process the LSP header. An LSP client
can use the ch_sendexpr() and ch_evalexpr() functions to send a Vim expression to
the LSP server and receive the decoded responses.
I have tested this with the Vim9 LSP client and it works.
This PR adds the features requested in #9390.