Sergei Golitsyn @deft31

Software Engineer

There are some pieces taken from the documentation that were hard to rephrase.

Regarding compression speed, here you go; I think this should be enough: auth0.com/blog/beating-json-performance-with-protobuf
Actually, if you use REST on top of HTTP/2, gRPC won't inherently be faster, but here is a fairly detailed answer:
> Selective message compression. In gRPC a streaming RPC can decide to compress or not compress messages. For example, if you are streaming mixed text and images over a single stream (or really any mixed compressible content), you can turn off compression for the images. This saves you from compressing already compressed data which won't get any smaller, but will burn up your CPU.
> First class load balancing. While not an improvement in point to point connections, gRPC can intelligently pick which backend to send traffic to. (This is a library feature, not a wire protocol feature.) This means you can send your requests to the least loaded backend server without resorting to using a proxy. This is a latency win.
> Heavily optimized. gRPC (the library) is under continuous benchmarks to ensure that there are no speed regressions. Those benchmarks are improving constantly. Again, this doesn't have anything to do with gRPC the protocol, but your program will be faster for having used gRPC.
> As nfirvine said, you will see most of your performance improvement just from using Protobuf. While you could use proto with REST, it is very nicely integrated with gRPC. Technically, you could use JSON with gRPC, but most people don't want to pay the performance cost after getting used to protos.
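The "compressing already-compressed data won't get any smaller but will burn up your CPU" point is easy to demonstrate. A minimal Python sketch, where zlib stands in for the wire compressor and random bytes stand in for an already-compressed image frame (the payloads here are illustrative, not from the quoted benchmark):

```python
import os
import zlib

# Compressible payload: repetitive JSON-like text frames
text = b'{"user": "deft31", "active": true} ' * 500

# Incompressible payload: random bytes, like an already-compressed JPEG
image_like = os.urandom(15_000)

for name, payload in (("text", text), ("image-like", image_like)):
    compressed = zlib.compress(payload)
    ratio = len(compressed) / len(payload)
    print(f"{name}: {len(payload)} -> {len(compressed)} bytes (ratio {ratio:.2f})")
```

The text shrinks dramatically, while the random payload comes out slightly *larger* than the input: the compressor spent CPU for nothing, which is exactly why per-message (selective) compression on a mixed stream is useful.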
grpc.io/blog/mobile-benchmarks

> We found that regardless of the serialization/deserialization method used for protobuf, it was consistently about 3x faster for serializing than JSON. For deserialization, JSON is actually a bit faster for small messages (<1kb), around 1.5x, but for larger messages (>15kb) protobuf is 2x faster. For gzipped JSON, protobuf is well over 5x faster in serialization, regardless of size. For deserialization, both are about the same at small messages, but protobuf is about 3x faster for larger messages.

And that's just serialization.
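Protobuf isn't in the Python standard library, so the full comparison from the benchmark can't be reproduced here, but the extra cost of gzipping JSON (the slowest case in the quote) is easy to see with stdlib only. A rough sketch with an illustrative payload; the absolute numbers will vary by machine:

```python
import gzip
import json
import time

# Illustrative payload: 1000 small repetitive records
payload = {"users": [{"id": i, "name": f"user{i}", "active": True} for i in range(1000)]}

start = time.perf_counter()
raw = json.dumps(payload).encode()
plain_time = time.perf_counter() - start

start = time.perf_counter()
packed = gzip.compress(json.dumps(payload).encode())
gzip_time = time.perf_counter() - start

print(f"plain JSON: {len(raw)} bytes in {plain_time * 1e3:.2f} ms")
print(f"gzipped JSON: {len(packed)} bytes in {gzip_time * 1e3:.2f} ms")
```

Gzip trades a much smaller wire size for extra serialization time, which is the overhead protobuf avoids by being compact to begin with.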
On GitHub, as I pointed out, you can set up your own runner, and the minutes become unlimited.

> In GitLab those aren't actions

Here's a link. They are actions there:
docs.gitlab.com/ee/user/project/quick_actions.html
It was a typo, fixed. Thanks.
Thanks, fixed.
Yeah, I think that's just my clumsy Russian.
I had the same thing with GitLab at first =) I put sleep everywhere so everything had time to come back up.
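Instead of sprinkling fixed sleeps, a poll-with-timeout helper usually makes such CI scripts less flaky. A minimal sketch; `wait_until` and the health-check lambda are illustrative names, not a GitLab API:

```python
import itertools
import time

def wait_until(check, timeout=30.0, interval=0.5):
    """Poll `check` until it returns True; give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while True:
        if check():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)

# Example: a stand-in health check that succeeds on the third poll
attempts = itertools.count(1)
ready = wait_until(lambda: next(attempts) >= 3, timeout=5.0, interval=0.1)
print("service ready:", ready)
```

This waits only as long as the service actually needs to come back up, instead of the worst-case sleep every time.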