Commit 9c7d799

Merge pull request #738 from macrocosm-os/codex/fix-spelling-mistakes-in-readme
Fix README typos
2 parents af2c3c4 + 6e72dcb commit 9c7d799

File tree

3 files changed (+9 −9 lines)

README.md

Lines changed: 3 additions & 3 deletions
@@ -42,14 +42,14 @@ For a more detailed read, check our docs here: [docs.macrocosmos.ai/subnets/subn
 You can also access Subnet 1, Apex via the API. Find out more here: [docs.macrocosmos.ai/developers/api-documentation/sn1-apex](https://docs.macrocosmos.ai/developers/api-documentation/sn1-apex)

 ### Mission Commander
-This is an agentic LLM chatbot built into Gravity, designed to help you pick the right terms and phrases for your data-scraping needs. Simply tell it what information you want and it'll offer suggestions and help with brainstorming. Mission Commander is built with Subnet 1, Apex, also owned by Macrocosmos. It lowers the barrier-of-entry even.
+This is an agentic LLM chatbot built into Gravity, designed to help you pick the right terms and phrases for your data-scraping needs. Simply tell it what information you want and it'll offer suggestions and help with brainstorming. Mission Commander is built with Subnet 1, Apex, also owned by Macrocosmos. It lowers the barrier to entry even further.

 Try Mission Commander via Gravity here: [app.macrocosmos.ai/gravity](http://app.macrocosmos.ai/gravity)

 For a more detailed read, check our docs here: [docs.macrocosmos.ai/constellation-user-guides/gravity](https://docs.macrocosmos.ai/constellation-user-guides/gravity)

 ### MCP (Macrocosmos Connect Protocol)
-You can integrate Subnet 1, Apex, directly into Claude and Cursor via our MCP. This allows you to access our web-search options and inference via other routes, rather than only from our website. Will provide URL to Apex, api key, and a guide how to use the model.
+You can integrate Subnet 1, Apex, directly into Claude and Cursor via our MCP. This allows you to access our web-search options and inference via other routes, rather than only from our website. It will provide a URL to Apex, an API key, and a guide on how to use the model.

 Try the MCP by following our guide here: [docs.macrocosmos.ai/developers/tools/macrocosmos-mcp](https://docs.macrocosmos.ai/developers/tools/macrocosmos-mcp)

@@ -86,7 +86,7 @@ Apex has the potential to become the flagship decentralized LLM experience acros
 ## The Team Behind Subnet 1
-Subnet 1 was built by Dr. Steffen Cruz, AKA @Macrocrux, when he was CTO of Bittensor. Steffen has led Apex through multiple iterations, overseeing its evolution into Bittensor's premiere provider of decentralized intelligence.
+Subnet 1 was built by Dr. Steffen Cruz, AKA @Macrocrux, when he was CTO of Bittensor. Steffen has led Apex through multiple iterations, overseeing its evolution into Bittensor's premier provider of decentralized intelligence.

 Apex's engineering team is one of the most impressive on Bittensor. It includes Felix Quinque, who led its Chain of Thought, Reasoning, and Logits upgrades, Dmytro Bobrenko with Organic Scoring and DeepResearcher, Rich Wardle's research and development, and Kalei Brady, who led GAN based architecture upgrade and leads SN1's Discord Community. It also receives the support of other Macrocosmos engineers, ensuring that Subnet 1 is one of the best-staffed projects on the protocol - all of which helps ensure its long-term viability.

docs/stream_miner_template.md

Lines changed: 5 additions & 5 deletions
@@ -1,6 +1,6 @@
 # Creating Stream Miners

-Miners for SN1 **must** support the StreamPromptingSynapse. This enables all miners on the network to stream batches of tokens to the validator. This has clear beneifts, such as:
+Miners for SN1 **must** support the StreamPromptingSynapse. This enables all miners on the network to stream batches of tokens to the validator. This has clear benefits, such as:

 1. Getting rewards for partial responses, and
 2. Enabling better user-product interactivity when using a frontend.
@@ -9,10 +9,10 @@ Getting custom miners to use streaming is a large engineering effort. To make th
 ## Architecture

-Miner architectures require that you are running a syncronous `forward` method, with an internal `async _forward` function. The code below provides a basic outline of how the `async _forward` function should be structured. There are two main points here:
+Miner architectures require that you are running a synchronous `forward` method, with an internal `async _forward` function. The code below provides a basic outline of how the `async _forward` function should be structured. There are two main points here:

 1. Adding data to the buffer and sending it when it reaches the `config.neuron.streaming_batch_size`
-2. Sending the final buffer of data if inference is finished, and there are less tokens than the batch size.
+2. Sending the final buffer of data if inference is finished, and there are fewer tokens than the batch size.

 ```python
 def forward(self, synapse: StreamPromptingSynapse) -> Awaitable:
@@ -80,8 +80,8 @@ def forward(self, synapse: StreamPromptingSynapse) -> Awaitable:
 HuggingFace miners require you to run a separate inference thread in the background, add to a queue, and manually clear it at the end of the `async _forward` method.

-This branch contains multiple inplementations. To see:
+This branch contains multiple implementations. To see:
 1. Langchain+OpenAI implementation, refer to `prompting/miners/openai_miner.py`
 2. HuggingFace implementation, refer to `prompting/miners/hf_miner.py`

-It is **necessary** that forward method of the miner class returns this `synapse.create_streaming_response(token_streamer)`. As seen, the `token_streamer` is a partial function that takes in a `send` packet. This packet will be sent by the bittensor middleware to facilitate the communications between the validator and the miner. You do **not** need to modify any logic around the `send` packet, as this is the same for **all** miners.
+It is **necessary** that the forward method of the miner class returns this `synapse.create_streaming_response(token_streamer)`. As seen, the `token_streamer` is a partial function that takes in a `send` packet. This packet will be sent by the bittensor middleware to facilitate the communications between the validator and the miner. You do **not** need to modify any logic around the `send` packet, as this is the same for **all** miners.
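For orientation, here is a minimal sketch of the pattern the template describes: a synchronous `forward` with an internal `async _forward` that batches tokens from a background HuggingFace inference thread and returns `synapse.create_streaming_response(token_streamer)`. This is not the repository's exact miner code; the `self.model`/`self.tokenizer` attributes, the Starlette `Send` hint, the ASGI-style `send` payload, and reading the prompt from `synapse.messages[-1]` are assumptions for illustration. Only `StreamPromptingSynapse`, `config.neuron.streaming_batch_size`, and `synapse.create_streaming_response` come from the template itself; see `prompting/miners/hf_miner.py` for the real implementation.

```python
# Sketch only: illustrates buffering + background inference, not the repo's exact code.
import threading
from functools import partial
from typing import Awaitable

from starlette.types import Send  # assumption: ASGI-style send callable
from transformers import TextIteratorStreamer


def forward(self, synapse: StreamPromptingSynapse) -> Awaitable:
    async def _forward(prompt: str, send: Send):
        # Background inference thread: generation fills the streamer's internal
        # queue while this coroutine drains it token by token.
        streamer = TextIteratorStreamer(self.tokenizer, skip_prompt=True)
        inputs = self.tokenizer(prompt, return_tensors="pt")
        thread = threading.Thread(
            target=self.model.generate,
            kwargs=dict(**inputs, streamer=streamer, max_new_tokens=256),
        )
        thread.start()

        buffer = []
        for token in streamer:
            buffer.append(token)
            # Point 1: send a batch once the buffer reaches the configured size.
            if len(buffer) == self.config.neuron.streaming_batch_size:
                await send(
                    {
                        "type": "http.response.body",
                        "body": "".join(buffer).encode("utf-8"),
                        "more_body": True,
                    }
                )
                buffer = []

        # Point 2: inference finished with fewer tokens than a full batch left,
        # so flush the remainder and close the stream.
        await send(
            {
                "type": "http.response.body",
                "body": "".join(buffer).encode("utf-8"),
                "more_body": False,
            }
        )
        thread.join()

    # The forward method must return the streaming response; token_streamer is
    # the partial function that later receives the `send` packet from bittensor.
    token_streamer = partial(_forward, synapse.messages[-1])  # assumed prompt field
    return synapse.create_streaming_response(token_streamer)
```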

shared/exceptions.py

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@ def __init__(self, message="Maximum number of retries exceeded"):
 class BittensorError(Exception):
     """Exception raised when an error is raised from the bittensor package"""

-    def __init__(self, message="An error from the Bittensor package occured"):
+    def __init__(self, message="An error from the Bittensor package occurred"):
         self.message = message
         super().__init__(self.message)
