diff --git a/website/src/pages/ar/substreams/sps/_meta.js b/website/src/pages/ar/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/ar/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/ar/substreams/sps/faq.mdx b/website/src/pages/ar/substreams/sps/faq.mdx deleted file mode 100644 index c19b0a950297..000000000000 --- a/website/src/pages/ar/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations, can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. 
- -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
- -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON-RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. - -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model: The best data model, including balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
- -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individuals' modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
- -The Substreams and Substreams-powered Subgraphs integration promises many benefits, including highly performant indexing and greater composability through leveraging community modules and building on them. diff --git a/website/src/pages/ar/substreams/sps/introduction.mdx b/website/src/pages/ar/substreams/sps/introduction.mdx deleted file mode 100644 index e74abf2f0998..000000000000 --- a/website/src/pages/ar/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction --- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.
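The trigger approach described above can be sketched in miniature. The following is an illustrative, non-authoritative TypeScript sketch: a handler receives the Substreams module's output as raw bytes and derives one entity per decoded item. The `DecodedTransfer` shape and the injected `decode` function are hypothetical stand-ins for the Protobuf bindings a real project would generate; in an actual Subgraph, entities would be saved via the generated schema classes rather than returned.

```typescript
// Illustrative sketch only: `DecodedTransfer` and `decode` stand in for the
// generated Protobuf bindings of a real Substreams module.
type DecodedTransfer = { txnId: string; amount: string }

// A trigger handler receives the module's output as raw bytes, decodes it,
// and derives one entity per decoded item.
function handleTriggers(
  bytes: Uint8Array,
  decode: (b: Uint8Array) => DecodedTransfer[],
): { id: string; amount: string }[] {
  const transfers = decode(bytes)
  // Entity IDs must be unique; a common pattern is `${txnId}-${index}`.
  return transfers.map((t, i) => ({ id: `${t.txnId}-${i}`, amount: t.amount }))
}
```

Because triggers are consumed linearly in graph-node, the handler sees items in block order, which is why a per-block index suffices to keep the IDs unique.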
- -### Additional Resources - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/ar/substreams/sps/triggers.mdx b/website/src/pages/ar/substreams/sps/triggers.mdx deleted file mode 100644 index 1bf1a2cf3f51..000000000000 --- a/website/src/pages/ar/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers --- - -Use Custom Triggers and enable the full use of GraphQL. - -## Overview - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. - -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
- if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object -2. Looping over the transactions -3. Create a new Subgraph entity for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Additional Resources - -To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/). diff --git a/website/src/pages/ar/substreams/sps/tutorial.mdx b/website/src/pages/ar/substreams/sps/tutorial.mdx deleted file mode 100644 index c41b10d885cd..000000000000 --- a/website/src/pages/ar/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial --- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. - -## Get Started - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial). - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. - -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2.
Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! 
-} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. 
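One hypothetical customization of the mapping sketched above is to filter decoded transfers before saving them as entities, so that only transfers at or above some minimum amount are indexed. This is an illustrative sketch, not part of the tutorial's generated code; the `DecodedTransfer` shape and the threshold parameter are assumptions for the example.

```typescript
// Hypothetical customization sketch: keep only transfers at or above a
// minimum amount before turning them into entities.
// `DecodedTransfer` is an illustrative stand-in for the generated Protobuf type.
type DecodedTransfer = { txnId: string; amount: string }

function filterTransfers(transfers: DecodedTransfer[], minAmount: number): DecodedTransfer[] {
  // SPL token amounts arrive as decimal strings; production code would compare
  // them as big integers to avoid precision loss on very large values.
  return transfers.filter((t) => Number(t.amount) >= minAmount)
}
```

A filter like this would run inside `handleTriggers`, between decoding the bytes and calling `entity.save()`, so skipped transfers never touch the store.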
- -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/cs/substreams/sps/_meta.js b/website/src/pages/cs/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/cs/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/cs/substreams/sps/faq.mdx b/website/src/pages/cs/substreams/sps/faq.mdx deleted file mode 100644 index 25e77dc3c7f1..000000000000 --- a/website/src/pages/cs/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ --- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs.
When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams?
- -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON-RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. - -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model: The best data model, including balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.
- -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individuals' modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs?
- -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely performant indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/cs/substreams/sps/introduction.mdx b/website/src/pages/cs/substreams/sps/introduction.mdx deleted file mode 100644 index 4938d23102e4..000000000000 --- a/website/src/pages/cs/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction --- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams.
However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### Additional Resources - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/cs/substreams/sps/triggers.mdx b/website/src/pages/cs/substreams/sps/triggers.mdx deleted file mode 100644 index b0c4bea23f3d..000000000000 --- a/website/src/pages/cs/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers --- - -Use Custom Triggers and enable the full use of GraphQL. - -## Overview - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
- -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object, this object is used like any other AssemblyScript object -2. Looping over the transactions -3. Create a new Subgraph entity for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Další zdroje - -To scaffold your first project in the Development Container, check out one of the [How-To Guide](/substreams/developing/dev-container/). diff --git a/website/src/pages/cs/substreams/sps/tutorial.mdx b/website/src/pages/cs/substreams/sps/tutorial.mdx deleted file mode 100644 index 67d564483af1..000000000000 --- a/website/src/pages/cs/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. - -## Začněte - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial) - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. 
- -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. 
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-}
-```
-
-This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
-
-### Step 4: Handle Substreams Data in `mappings.ts`
-
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.
-
-The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:
-
-```ts
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
-
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
-
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
-
-    if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
-      }
-      entity.save()
-    }
-  }
-}
-```
-
-### Step 5: Generate Protobuf Files
-
-To generate Protobuf objects in AssemblyScript, run the following command:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### Conclusion
-
-Congratulations!
You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
-
-### Video Tutorial
-
-
-
-### Additional Resources
-
-For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/src/pages/de/substreams/sps/_meta.js b/website/src/pages/de/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/de/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
-export default {
-  introduction: '',
-  triggers: '',
-  tutorial: '',
-  faq: '',
-}
diff --git a/website/src/pages/de/substreams/sps/faq.mdx b/website/src/pages/de/substreams/sps/faq.mdx
deleted file mode 100644
index 705188578529..000000000000
--- a/website/src/pages/de/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-title: Substreams-Powered Subgraphs FAQ
-sidebarTitle: FAQ
----
-
-## What are Substreams?
-
-Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
-
-Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.
-
-Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
-
-## What are Substreams-powered Subgraphs?
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer, with all the benefits of Subgraphs, such as a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single datasource which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph.
They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data which is not available as part of the JSON RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using Firehose, including:
-
-- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtimes: Designed from the ground up for High Availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: Includes the balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.
-
-- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, enabling cached modules to be re-used.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request packages all of these individual modules and links them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).
-
-## Where can I find examples of Substreams and Substreams-powered Subgraphs?
-
-You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.
-
-## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
-
-The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.
diff --git a/website/src/pages/de/substreams/sps/introduction.mdx b/website/src/pages/de/substreams/sps/introduction.mdx
deleted file mode 100644
index 396c53077fd1..000000000000
--- a/website/src/pages/de/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
----
-title: Introduction to Substreams-Powered Subgraphs
-sidebarTitle: Introduction
----
-
-Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.
-
-## Overview
-
-Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
-
-### Specifics
-
-There are two methods of enabling this technology:
-
-1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph.
-
-2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.
-
-You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.
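To make the trigger approach concrete, here is a schematic sketch in plain TypeScript. It mimics the shape of a trigger handler: decoded Substreams records are mapped one-to-one onto Subgraph-style entities. `DecodedTransfer` and `TransferEntity` are hypothetical stand-ins for the generated Protobuf type and the generated schema class, not real Graph APIs.

```typescript
// Hypothetical stand-in for a decoded Substreams record.
interface DecodedTransfer {
  txnId: string
  amount: string
}

// Hypothetical stand-in for a generated Subgraph entity.
interface TransferEntity {
  id: string
  amount: string
}

// Trigger-style handler shape: one entity per decoded record,
// with deterministic IDs (txn id + index) so re-indexing is reproducible.
function handleTrigger(transfers: DecodedTransfer[]): TransferEntity[] {
  const entities: TransferEntity[] = []
  for (let i = 0; i < transfers.length; i++) {
    entities.push({ id: `${transfers[i].txnId}-${i}`, amount: transfers[i].amount })
  }
  return entities
}

console.log(handleTrigger([{ txnId: 'abc', amount: '42' }]))
```

In a real mapping the records arrive inside one block's payload and the entities are persisted with `.save()`, so this linear loop is exactly where triggers trade parallelism for simplicity.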
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/de/substreams/sps/triggers.mdx b/website/src/pages/de/substreams/sps/triggers.mdx
deleted file mode 100644
index 792dee351596..000000000000
--- a/website/src/pages/de/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
-2. The handler loops over the transactions
-3. A new Subgraph entity is created and saved for every transaction
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/de/substreams/sps/tutorial.mdx b/website/src/pages/de/substreams/sps/tutorial.mdx
deleted file mode 100644
index db9bb0793890..000000000000
--- a/website/src/pages/de/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-In this tutorial, you will set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Get Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
-
-### Step 1: Initialize Your Project
-
-1. Open your Dev Container and run the following command to initialize your project:
-
-   ```bash
-   substreams init
-   ```
-
-2. Select the "minimal" project option.
-
-3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### Step 2: Generate the Subgraph Manifest
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
-
-```yaml
----
-dataSources:
-  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-        moduleName: map_spl_transfers # Module defined in the substreams.yaml
-        file: ./my-project-sol-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers
-```
-
-### Step 3: Define Entities in `schema.graphql`
-
-Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-}
-```
-
-This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
-
-### Step 4: Handle Substreams Data in `mappings.ts`
-
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.
-
-The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:
-
-```ts
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
-
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
-
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
-
-    if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
-      }
-      entity.save()
-    }
-  }
-}
-```
-
-### Step 5: Generate Protobuf Files
-
-To generate Protobuf objects in AssemblyScript, run the following command:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### Conclusion
-
-Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
-
-### Video Tutorial
-
-
-
-### Additional Resources
-
-For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
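The signer-selection branch in the handler above (single signer vs. multisig) can be isolated and unit-tested as plain TypeScript. The `SignerInfo` shape below is a hypothetical stand-in for the generated Protobuf message, not the actual generated type:

```typescript
// Hypothetical stand-in for the generated Protobuf signer message.
interface SignerInfo {
  single?: { signer: string }
  multisig?: { signers: string[] }
}

// Mirrors the branch in handleTriggers: prefer the single signer,
// otherwise fall back to the multisig signer list, else no signers.
function extractSigners(signer: SignerInfo): string[] {
  if (signer.single != null) {
    return [signer.single.signer]
  } else if (signer.multisig != null) {
    return signer.multisig.signers
  }
  return []
}

console.log(extractSigners({ single: { signer: 'alice' } }))
```

Keeping branches like this in small pure functions makes the mapping easier to test outside of graph-node before deploying.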
diff --git a/website/src/pages/en/substreams/_meta-titles.json b/website/src/pages/en/substreams/_meta-titles.json
index b8799cc89251..6262ad528c3a 100644
--- a/website/src/pages/en/substreams/_meta-titles.json
+++ b/website/src/pages/en/substreams/_meta-titles.json
@@ -1,4 +1,3 @@
 {
-  "developing": "Developing",
-  "sps": "Substreams-powered Subgraphs"
+  "developing": "Developing"
 }
diff --git a/website/src/pages/en/substreams/_meta.js b/website/src/pages/en/substreams/_meta.js
index a8ee04618eae..658bb8c213c6 100644
--- a/website/src/pages/en/substreams/_meta.js
+++ b/website/src/pages/en/substreams/_meta.js
@@ -5,5 +5,4 @@ export default {
   introduction: '',
   developing: titles.developing ?? '',
   publishing: '',
-  sps: titles.sps ?? '',
 }
diff --git a/website/src/pages/en/substreams/sps/_meta.js b/website/src/pages/en/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/en/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
-export default {
-  introduction: '',
-  triggers: '',
-  tutorial: '',
-  faq: '',
-}
diff --git a/website/src/pages/en/substreams/sps/faq.mdx b/website/src/pages/en/substreams/sps/faq.mdx
deleted file mode 100644
index 250c466d5929..000000000000
--- a/website/src/pages/en/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-title: Substreams-Powered Subgraphs FAQ
-sidebarTitle: FAQ
----
-
-## What are Substreams?
-
-Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
-
-Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer.
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.
-
-Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
-
-## What are Substreams-powered Subgraphs?
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single datasource which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs.
They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data which is not available as part of the JSON RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using Firehose, including:
-
-- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtimes: Designed from the ground up for High Availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: Includes the balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.
-
-- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, enabling cached modules to be re-used.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle.
A single Substreams request packages all of these individual modules and links them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).
-
-## Where can I find examples of Substreams and Substreams-powered Subgraphs?
-
-You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.
-
-## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
-
-The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.
diff --git a/website/src/pages/en/substreams/sps/introduction.mdx b/website/src/pages/en/substreams/sps/introduction.mdx
deleted file mode 100644
index 92d8618165dd..000000000000
--- a/website/src/pages/en/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
----
-title: Introduction to Substreams-Powered Subgraphs
-sidebarTitle: Introduction
----
-
-Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.
-
-## Overview
-
-Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
-
-### Specifics
-
-There are two methods of enabling this technology:
-
-1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph.
This method creates the Subgraph entities directly in the Subgraph.
-
-2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.
-
-You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/en/substreams/sps/triggers.mdx b/website/src/pages/en/substreams/sps/triggers.mdx
deleted file mode 100644
index 66687aa21889..000000000000
--- a/website/src/pages/en/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler.
This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. - -```tsx -import { log } from '@graphprotocol/graph-ts' -import { Transaction } from '../generated/schema' -// The generated `Transactions` Protobuf class is also imported; its path depends on your codegen output. - -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object -2. The handler loops over the transactions -3. A new Subgraph entity is created for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Additional Resources - -To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/). diff --git a/website/src/pages/en/substreams/sps/tutorial.mdx b/website/src/pages/en/substreams/sps/tutorial.mdx deleted file mode 100644 index 4113baa43ae6..000000000000 --- a/website/src/pages/en/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,154 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
- -## Get Started - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial). - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. - -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package:
- moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -}
-``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/es/substreams/sps/_meta.js b/website/src/pages/es/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/es/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/es/substreams/sps/faq.mdx b/website/src/pages/es/substreams/sps/faq.mdx deleted file mode 100644 index dd7685e1a4be..000000000000 --- a/website/src/pages/es/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer.
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single datasource which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelised processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs?
- -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, perform transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from the ground up to process the full history of blockchains at speeds never seen before. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Visit the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose?
- -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. - -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model: Best data model that includes the balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without writing any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer, which allows cached modules to be reused.
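As an illustrative sketch of what this composition can look like in a package manifest — every package name, module name, and URL below is hypothetical, modeled loosely on the `imports`/`use` manifests shown elsewhere in this document — a downstream package imports a published `.spkg` and layers its own module on top of one it contains:

```yaml
specVersion: v0.1.0
package:
  name: price_oracle # hypothetical downstream package
  version: v0.1.0

imports: # reuse a community package as a building block
  dex: https://example.com/dex-prices-v0.1.0.spkg # hypothetical spkg URL

modules:
  - name: map_token_volume # new module layered on the imported one
    use: dex:map_prices # hypothetical module shipped inside the imported .spkg
    initialBlock: 200000000
```

Because module outputs are cached at the transformation layer, the downstream package can reuse the upstream module's work rather than recomputing it.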
- -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/es/substreams/sps/introduction.mdx b/website/src/pages/es/substreams/sps/introduction.mdx deleted file mode 100644 index 4340733cfc84..000000000000 --- a/website/src/pages/es/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1.
**Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### Additional Resources - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/es/substreams/sps/triggers.mdx b/website/src/pages/es/substreams/sps/triggers.mdx deleted file mode 100644 index 16db4057a732..000000000000 --- a/website/src/pages/es/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers ---- - -Use Custom Triggers and enable full use of GraphQL. - -## Overview - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields.
This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. - -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object -2. The handler loops over the transactions -3. A new Subgraph entity is created for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Additional Resources - -To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
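The decode–iterate–save pattern that the triggers page above describes can be sketched in plain TypeScript with stand-in types. Every name below is hypothetical and for illustration only; a real handler uses the generated Protobuf classes and `graph-ts` entities instead of these stand-ins:

```typescript
// Hypothetical stand-in for the generated Protobuf message type.
interface RawTransaction {
  hash: string
  from: string
  to: string
}

// Stand-in for `Transactions.decode(bytes.buffer)`: here the bytes are just JSON.
function decodeTransactions(bytes: Uint8Array): RawTransaction[] {
  return JSON.parse(new TextDecoder().decode(bytes)) as RawTransaction[]
}

// Stand-in for the entity store: one "entity" saved per decoded transaction.
function handleTransactions(bytes: Uint8Array, store: Map<string, RawTransaction>): number {
  const transactions = decodeTransactions(bytes) // 1. decode the raw bytes
  for (const transaction of transactions) {
    store.set(transaction.hash, transaction) // 2.–3. create and save one entity per transaction
  }
  return transactions.length
}
```

The design point this mirrors is that the handler stays a pure bytes-in, entities-out function, which keeps it easy to test in isolation.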
diff --git a/website/src/pages/es/substreams/sps/tutorial.mdx b/website/src/pages/es/substreams/sps/tutorial.mdx deleted file mode 100644 index 52ebe46d1753..000000000000 --- a/website/src/pages/es/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. - -## Get Started - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial). - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts, such as transactions and Protobufs. - -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3.
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]!
-} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
- -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/fr/substreams/sps/_meta.js b/website/src/pages/fr/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/fr/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/fr/substreams/sps/faq.mdx b/website/src/pages/fr/substreams/sps/faq.mdx deleted file mode 100644 index 9519360ba265..000000000000 --- a/website/src/pages/fr/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs.
When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single datasource which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelised processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph.
They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, perform transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from the ground up to process the full history of blockchains at speeds never seen before. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose?
- -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. - -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model: Best data model that includes the balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without writing any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
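As a rough, uncompiled sketch of the shape of such a module — assuming the `substreams` crate's map-handler attribute and generated Protobuf types; all type and field names here are illustrative and not taken from this document:

```rust
use substreams::errors::Error;
// `Block`, `Transfer`, and `Transfers` stand in for Protobuf types
// generated from your own .proto definitions.
use crate::pb::{Block, Transfer, Transfers};

// A map module: a pure Block -> Transfers transformation that the
// Substreams engine can execute over many block ranges in parallel.
#[substreams::handlers::map]
fn map_transfers(block: Block) -> Result<Transfers, Error> {
    let transfers = block
        .transactions
        .iter()
        .filter(|tx| tx.value > 0) // keep only value-bearing transactions
        .map(|tx| Transfer {
            from: tx.from.clone(),
            to: tx.to.clone(),
            amount: tx.value,
        })
        .collect();
    Ok(Transfers { transfers })
}
```

Because the function is a pure transformation of its input block, the engine is free to run it over disjoint block ranges concurrently, which is where the parallel-execution benefit comes from.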
- -See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer, which allows cached modules to be reused. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.
diff --git a/website/src/pages/fr/substreams/sps/introduction.mdx b/website/src/pages/fr/substreams/sps/introduction.mdx deleted file mode 100644 index 0454b6f4acee..000000000000 --- a/website/src/pages/fr/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction aux Subgraphs alimentés par Substreams -sidebarTitle: Présentation ---- - -Améliorez l'efficacité et l'évolutivité de votre subgraph en utilisant [Substreams](/substreams/introduction/) pour streamer des données blockchain pré-indexées. - -## Aperçu - -Utilisez un package Substreams (`.spkg`) comme source de données pour donner à votre Subgraph l'accès à un flux de données blockchain pré-indexées. Cela permet un traitement des données plus efficace et évolutif, en particulier avec des réseaux de blockchain complexes ou de grande taille. - -### Spécificités⁠ - -Il existe deux méthodes pour activer cette technologie : - -1. **Utilisation des [déclencheurs](/sps/triggers/) de Substreams ** : Consommez à partir de n'importe quel module Substreams en important le modèle Protobuf par le biais d'un gestionnaire de subgraph et déplacez toute votre logique dans un subgraph. Cette méthode crée les entités du subgraph directement dans le subgraph. - -2. **En utilisant [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)** : En écrivant une plus grande partie de la logique dans Substreams, vous pouvez consommer la sortie du module directement dans [graph-node](/indexing/tooling/graph-node/). Dans graph-node, vous pouvez utiliser les données de Substreams pour créer vos entités Subgraph. - -Vous pouvez choisir où placer votre logique, soit dans le subgraph, soit dans Substreams. Cependant, réfléchissez à ce qui correspond à vos besoins en matière de données, car Substreams a un modèle parallélisé et les déclencheurs sont consommés de manière linéaire dans graph node. 
-
-### Ressources supplémentaires
-
-Consultez les liens suivants pour obtenir des tutoriels sur l'utilisation de l'outil de génération de code afin de créer rapidement votre premier projet Substreams de bout en bout :
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/fr/substreams/sps/triggers.mdx b/website/src/pages/fr/substreams/sps/triggers.mdx deleted file mode 100644 index ecd1253f24c7..000000000000 --- a/website/src/pages/fr/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@
----
-title: Déclencheurs de Substreams
----
-
-Utilisez des déclencheurs personnalisés pour exploiter pleinement la couche GraphQL.
-
-## Aperçu
-
-Les déclencheurs personnalisés vous permettent d'envoyer des données directement dans votre fichier de mappage de subgraph et dans vos entités, qui sont similaires aux tables et aux champs. Cela vous permet d'utiliser pleinement la couche GraphQL.
-
-En important les définitions Protobuf émises par votre module Substreams, vous pouvez recevoir et traiter ces données dans le gestionnaire de votre subgraph. Cela garantit une gestion efficace et rationalisée des données dans le cadre du Subgraph.
-
-### Définition de `handleTransactions`
-
-Le code suivant montre comment définir une fonction `handleTransactions` dans un gestionnaire de Subgraph. Cette fonction reçoit en paramètre des bytes bruts provenant de Substreams et les décode en un objet `Transactions`. Pour chaque transaction, une nouvelle entité Subgraph est créée.
- -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Voici ce que vous voyez dans le fichier `mappings.ts` : - -1. Les bytes contenant les données Substreams sont décodés en un objet `Transactions` généré, qui est utilisé comme n’importe quel autre objet AssemblyScript -2. Boucle sur les transactions -3. Créer une nouvelle entité de subgraph pour chaque transaction - -Pour découvrir un exemple détaillé de subgraph à déclencheurs, [consultez le tutoriel](/sps/tutorial/). - -### Ressources supplémentaires - -Pour élaborer votre premier projet dans le conteneur de développement, consultez l'un des [guides pratiques](/substreams/developing/dev-container/). diff --git a/website/src/pages/fr/substreams/sps/tutorial.mdx b/website/src/pages/fr/substreams/sps/tutorial.mdx deleted file mode 100644 index d4876d6000bd..000000000000 --- a/website/src/pages/fr/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutoriel : Configurer un Subgraph alimenté par Substreams sur Solana' -sidebarTitle: Tutoriel ---- - -Mise en place réussie d'un subgraph alimenté par Substreams basé sur des déclencheurs pour un jeton Solana SPL. - -## Commencer - -Pour un tutoriel vidéo, consultez [Comment indexer Solana avec un subgraph alimenté par des Substreams](/sps/tutorial/#video-tutorial) - -### Prérequis - -Avant de commencer, assurez-vous de : - -- Avoir suivi le Guide [Getting Started](https://github.com/streamingfast/substreams-starter) pour configurer votre environnement de développement à l’aide d’un Dev Container. 
-- Être familier avec The Graph et des concepts de base de la blockchain tels que les transactions et les Protobufs. - -### Étape 1 : Initialiser votre projet - -1. Ouvrez votre Dev Container et exécutez la commande suivante pour initialiser votre projet : - - ```bash - substreams init - ``` - -2. Sélectionnez l'option de projet "minimal". - -3. Remplacez le contenu du fichier généré `substreams.yaml` par la configuration suivante, qui filtre les transactions du compte Orca sur l’ID du programme SPL token : - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Passez le spkg qui vous intéresse - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Sélectionnez les modules disponibles dans votre spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modifiez les champs param pour répondre à vos besoins - # Pour program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Étape 2 : Générer le Manifeste du Subgraph - -Une fois le projet initialisé, générez un manifeste de subgraph en exécutant la commande suivante dans le Dev Container : - -```bash -substreams codegen subgraph -``` - -Cette commande génère un fichier `subgraph.yaml` qui importe le package Substreams comme source de données: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Étape 3 : Définir les Entités dans 
`schema.graphql` - -Définissez les champs que vous souhaitez enregistrer dans vos entités Subgraph en mettant à jour le fichier `schema.graphql`. - -Voici un exemple : - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -Ce schéma définit une entité `MyTransfer` avec des champs tels que `id`, `amount`, `source`, `designation` et `signers`. - -### Étape 4 : Gérer les Données Substreams dans `mappings.ts` - -Avec les objets Protobuf générés, vous pouvez désormais gérer les données de Substreams décodées dans votre fichier `mappings.ts` trouvé dans le répertoire `./src`. - -L'exemple ci-dessous montre comment extraire vers les entités du subgraph les transferts non dérivés associés à l'Id du compte Orca : - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Étape 5 : Générer les Fichiers Protobuf - -Pour générer les objets Protobuf en AssemblyScript, exécutez la commande suivante : - -```bash -npm run protogen -``` - -Cette commande convertit 
les définitions Protobuf en AssemblyScript, ce qui permet de les utiliser dans le gestionnaire du subgraph. - -### Conclusion - -Félicitations ! Vous avez configuré avec succès un subgraph alimenté par Substreams basé sur des déclencheurs pour un jeton Solana SPL. Vous pouvez passer à l'étape suivante en personnalisant votre schéma, vos mappages et vos modules pour les adapter à votre cas d'utilisation spécifique. - -### Tutoriel Vidéo - - - -### Ressources supplémentaires - -Pour aller plus loin en matière de personnalisation et d’optimisation, consultez la [documentation officielle de Substreams](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/hi/substreams/sps/_meta.js b/website/src/pages/hi/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/hi/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/hi/substreams/sps/faq.mdx b/website/src/pages/hi/substreams/sps/faq.mdx deleted file mode 100644 index 3c77c89cebb0..000000000000 --- a/website/src/pages/hi/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: सबस्ट्रीम्स-पावर्ड सबग्राफ FAQ -sidebarTitle: FAQ ---- - -## सबस्ट्रीम क्या होते हैं? 
-
-सबस्ट्रीम एक अत्यधिक शक्तिशाली प्रोसेसिंग इंजन है जो ब्लॉकचेन डेटा की समृद्ध स्ट्रीम्स को उपभोग करने में सक्षम है। यह आपको ब्लॉकचेन डेटा को परिष्कृत और आकार देने की अनुमति देता है ताकि एंड-यूजर applications द्वारा इसे तेजी और सहजता से पचाया जा सके।
-
-यह एक ब्लॉकचेन-अज्ञेयवादी, समानांतरित, और स्ट्रीमिंग-प्रथम इंजन है, जो ब्लॉकचेन डेटा ट्रांसफॉर्मेशन लेयर के रूप में कार्य करता है। यह [Firehose](https://firehose.streamingfast.io/) द्वारा संचालित है और डेवलपर्स को Rust मॉड्यूल लिखने, कम्युनिटी मॉड्यूल्स पर निर्माण करने, बेहद उच्च-प्रदर्शन इंडेक्सिंग प्रदान करने, और अपना डेटा कहीं भी [sink](/substreams/developing/sinks/) करने में सक्षम बनाता है।
-
-सबस्ट्रीम को [StreamingFast](https://www.streamingfast.io/) द्वारा विकसित किया गया है। सबस्ट्रीम के बारे में अधिक जानने के लिए [सबस्ट्रीम Documentation](/substreams/introduction/) पर जाएं।
-
-## सबस्ट्रीम-संचालित सबग्राफ क्या हैं?
-
-[सबस्ट्रीम-powered सबग्राफ](/sps/introduction/) सबस्ट्रीम की शक्ति को सबग्राफ की queryability के साथ जोड़ते हैं। जब किसी सबस्ट्रीम-powered सबग्राफ को प्रकाशित किया जाता है, तो सबस्ट्रीम परिवर्तनों द्वारा निर्मित डेटा [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) उत्पन्न कर सकता है, जो सबग्राफ entities के साथ संगत होते हैं।
-
-यदि आप पहले से ही सबग्राफ विकास से परिचित हैं, तो ध्यान दें कि सबस्ट्रीम-संचालित सबग्राफ को उसी तरह से क्वेरी किया जा सकता है जैसे कि इसे AssemblyScript ट्रांसफॉर्मेशन लेयर द्वारा उत्पन्न किया गया हो। यह सबग्राफ के सभी लाभ प्रदान करता है, जिसमें एक डायनेमिक और लचीला GraphQL API शामिल है।
-
-## सबस्ट्रीम-powered सबग्राफ सामान्य सबग्राफ से कैसे भिन्न हैं?
-
-सबग्राफ डेटा सोर्सेस से बने होते हैं, जो ऑनचेन आयोजन को निर्धारित करते हैं और उन आयोजन को Assemblyscript में लिखे handler के माध्यम से कैसे ट्रांसफॉर्म करना चाहिए। ये आयोजन क्रमवार तरीके से प्रोसेस किए जाते हैं, जिस क्रम में ये आयोजन ऑनचेन होते हैं।
-
-इसके विपरीत, सबस्ट्रीम-powered सबग्राफ के पास एक ही datasource होता है जो एक सबस्ट्रीम package को संदर्भित करता है, जिसे ग्राफ नोड द्वारा प्रोसेस किया जाता है। सबस्ट्रीम को पारंपरिक सबग्राफ की तुलना में अतिरिक्त विस्तृत ऑनचेन डेटा तक पहुंच प्राप्त होती है, और यह बड़े पैमाने पर समानांतर प्रोसेसिंग से भी लाभ उठा सकते हैं, जिससे प्रोसेसिंग समय काफी तेज़ हो सकता है।
-
-## सबस्ट्रीम-powered सबग्राफ के उपयोग के लाभ क्या हैं?
-
-सबस्ट्रीम-powered सबग्राफ सभी लाभों को एक साथ लाते हैं जो सबस्ट्रीम और सबग्राफ प्रदान करते हैं। वे अधिक संयोजनशीलता और उच्च-प्रदर्शन इंडेक्सिंग को The Graph में लाते हैं। वे नए डेटा उपयोग के मामलों को भी सक्षम बनाते हैं; उदाहरण के लिए, एक बार जब आपने अपना सबस्ट्रीम-powered सबग्राफ बना लिया, तो आप अपने [सबस्ट्रीम modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) को पुन: उपयोग कर सकते हैं ताकि विभिन्न [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) जैसे कि PostgreSQL, MongoDB, और Kafka में आउटपुट किया जा सके।
-
-## Substreams के क्या benefits हैं?
-
-Substreams का उपयोग करने के कई benefits हैं, जिनमें:
-
-- Composable: आप Substreams modules को LEGO blocks की तरह stack कर सकते हैं, और community modules पर निर्माण करके public data को और अधिक refine कर सकते हैं।
-
-- High-performance indexing: बड़े पैमाने पर parallel operations के क्लस्टरों के माध्यम से कई गुना तेज़ इंडेक्सिंग (think BigQuery).
-
-- Sink anywhere: अपना डेटा कहीं भी सिंक करें: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data which is not available as part of the JSON RPC
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## Firehose के क्या benefits हैं?
-
-Firehose का उपयोग करने के कई benefits हैं, जिनमें:
-
-- सबसे कम latency और कोई polling नहीं: streaming-first fashion में, Firehose nodes को पहले block data को push करने की दौड़ के लिए designed किया गया है।
-
-- Prevents downtimes: उच्च उपलब्धता के लिए मौलिक रूप से design किया गया है।
-
-- Never miss a beat: Firehose stream cursor को forks संभालने और किसी भी स्थिति में जहाँ आप छोड़े थे वहीं से जारी रखने के लिए design किया गया है।
-
-- Richest data model: Best data model जिसमें balance changes, the full call tree, आंतरिक लेनदेन, logs, storage changes, gas costs और बहुत कुछ शामिल है।
-
-- Leverages flat files: blockchain data को flat files में निकाला जाता है, जो सबसे सस्ता और सबसे अनुकूलित कंप्यूट संसाधन है।
-
-## डेवलपर्स सबस्ट्रीम-powered सबग्राफ और सबस्ट्रीम के बारे में अधिक जानकारी कहाँ प्राप्त कर सकते हैं?
-
-[सबस्ट्रीम documentation](/substreams/introduction/) आपको सबस्ट्रीम modules बनाने का तरीका सिखाएगी।
-
-[सबस्ट्रीम-powered सबग्राफ documentation](/sps/introduction/) आपको यह दिखाएगी कि उन्हें The Graph पर परिनियोजन के लिए कैसे संकलित किया जाए।
-
-[नवीनतम Substreams Codegen टूल](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) आपको बिना किसी कोड के एक Substreams प्रोजेक्ट शुरू करने की अनुमति देगा।
-
-## Substreams में Rust modules की क्या भूमिका है?
-
-Rust मॉड्यूल्स सबग्राफ में AssemblyScript मैपर्स के समकक्ष होते हैं। इन्हें समान तरीके से WASM में संकलित किया जाता है, लेकिन प्रोग्रामिंग मॉडल समानांतर निष्पादन की अनुमति देता है। ये उस प्रकार के रूपांतरण और समुच्चयन को परिभाषित करते हैं, जिन्हें आप कच्चे ब्लॉकचेन डेटा पर लागू करना चाहते हैं।
-
-See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## Substreams को composable क्या बनाता है?
-
-When using Substreams, the composition happens at the transformation layer enabling cached modules to be re-used.
-
-ऐसे मान लीजिए, एलिस एक DEX प्राइस मॉड्यूल बना सकती है, बॉब इसका उपयोग करके अपने इच्छित कुछ टोकनों के लिए एक वॉल्यूम एग्रीगेटर बना सकता है, और लिसा चार अलग-अलग DEX प्राइस मॉड्यूल को जोड़कर एक प्राइस ओरैकल बना सकती है। एक ही सबस्ट्रीम अनुरोध इन सभी व्यक्तिगत मॉड्यूल्स को एक साथ पैकेज करेगा, उन्हें आपस में लिंक करेगा, और एक अधिक परिष्कृत डेटा स्ट्रीम प्रदान करेगा। उस स्ट्रीम का उपयोग फिर एक सबग्राफ को पॉप्युलेट करने के लिए किया जा सकता है और उपभोक्ताओं द्वारा क्वेरी किया जा सकता है।
-
-## आप कैसे एक Substreams-powered Subgraph बना सकते हैं और deploy कर सकते हैं?
-
-सबस्ट्रीम-समर्थित सबग्राफ को [परिभाषित](/sps/introduction/) करने के बाद, आप इसे Graph CLI का उपयोग करके [सबग्राफ Studio](https://thegraph.com/studio/) में डिप्लॉय कर सकते हैं।
-
-## आप सबस्ट्रीम और सबस्ट्रीम-powered सबग्राफ के उदाहरण कहाँ पा सकते हैं?
-
-आप [इस Github रिपॉज़िटरी](https://github.com/pinax-network/awesome-substreams) पर जाकर सबस्ट्रीम और सबस्ट्रीम-powered सबग्राफ के उदाहरण देख सकते हैं।
-
-## सबस्ट्रीम और सबस्ट्रीम-powered सबग्राफ का The Graph Network के लिए क्या अर्थ है?
-
-The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/hi/substreams/sps/introduction.mdx b/website/src/pages/hi/substreams/sps/introduction.mdx deleted file mode 100644 index 56ee02d1d54a..000000000000 --- a/website/src/pages/hi/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@
----
-title: सबस्ट्रीम-पावर्ड सबग्राफ का परिचय
-sidebarTitle: Introduction
----
-
-अपने सबग्राफ की कार्यक्षमता और स्केलेबिलिटी को बढ़ाएं [सबस्ट्रीम](/substreams/introduction/) का उपयोग करके, जो प्री-इंडेक्स्ड ब्लॉकचेन डेटा को स्ट्रीम करता है।
-
-## Overview
-
-सबस्ट्रीम पैकेज (`.spkg`) को डेटा स्रोत के रूप में उपयोग करें ताकि आपका सबग्राफ पहले से इंडेक्स किए गए ब्लॉकचेन डेटा की स्ट्रीम तक पहुंच प्राप्त कर सके। यह बड़े या जटिल ब्लॉकचेन नेटवर्क के साथ अधिक कुशल और स्केलेबल डेटा हैंडलिंग को सक्षम बनाता है।
-
-### विशिष्टताएँ
-
-इस तकनीक को सक्षम करने के दो तरीके हैं:
-
-1. **सबस्ट्रीम [triggers](/sps/triggers/) का उपयोग करना**: किसी भी सबस्ट्रीम मॉड्यूल से उपभोग करने के लिए, Protobuf मॉडल को एक सबग्राफ हैंडलर के माध्यम से आयात करें और अपनी पूरी लॉजिक को एक सबग्राफ में स्थानांतरित करें। इस विधि से सबग्राफ entities सीधे सबग्राफ में बनाई जाती हैं।
-
-2.
**[Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out) का उपयोग करके**: अधिक लॉजिक को सबस्ट्रीम में लिखकर, आप सीधे मॉड्यूल के आउटपुट को [`graph-node`](/indexing/tooling/graph-node/) में कंज्यूम कर सकते हैं। graph-node में, आप सबस्ट्रीम डेटा का उपयोग करके अपनी सबग्राफ entities बना सकते हैं।
-
-आप अपना लॉजिक सबग्राफ या सबस्ट्रीम में कहीं भी रख सकते हैं। हालाँकि, अपने डेटा की आवश्यकताओं के अनुसार निर्णय लें, क्योंकि सबस्ट्रीम एक समानांतर मॉडल का उपयोग करता है, और ट्रिगर `graph-node` में रैखिक रूप से उपभोग किए जाते हैं।
-
-### Additional Resources
-
-इन लिंक पर जाएं ताकि आप कोड-जनरेशन टूलिंग का उपयोग करके अपना पहला एंड-टू-एंड सबस्ट्रीम प्रोजेक्ट तेजी से बना सकें:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/hi/substreams/sps/triggers.mdx b/website/src/pages/hi/substreams/sps/triggers.mdx deleted file mode 100644 index 196694448b05..000000000000 --- a/website/src/pages/hi/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@
----
-title: सबस्ट्रीम्स ट्रिगर्स
----
-
-कस्टम ट्रिगर्स का उपयोग करें और GraphQL लेयर का पूर्ण उपयोग सक्षम करें।
-
-## Overview
-
-कस्टम ट्रिगर्स आपको डेटा सीधे आपके सबग्राफ मैपिंग फ़ाइल और entities में भेजने की अनुमति देते हैं, जो तालिकाओं और फ़ील्ड्स के समान होते हैं। इससे आप पूरी तरह से GraphQL लेयर का उपयोग कर सकते हैं।
-
-आपके सबस्ट्रीम मॉड्यूल द्वारा उत्पन्न Protobuf परिभाषाओं को आयात करके, आप इस डेटा को अपने सबग्राफ के handler में प्राप्त और प्रोसेस कर सकते हैं। यह सबग्राफ ढांचे के भीतर कुशल और सुव्यवस्थित डेटा प्रबंधन सुनिश्चित करता है।
-
-### 
`handleTransactions` को परिभाषित करना
-
-यह कोड एक सबग्राफ handler में `handleTransactions` फ़ंक्शन को परिभाषित करने का तरीका दर्शाता है। यह फ़ंक्शन कच्चे सबस्ट्रीम बाइट्स को पैरामीटर के रूप में प्राप्त करता है और उन्हें `Transactions` ऑब्जेक्ट में डिकोड करता है। प्रत्येक लेन-देन के लिए, एक नया सबग्राफ entity बनाया जाता है।
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-यहाँ आप `mappings.ts` फ़ाइल में जो देख रहे हैं:
-
-1. Substreams डेटा को जनरेट किए गए `Transactions` ऑब्जेक्ट में डिकोड किया जाता है, यह ऑब्जेक्ट किसी अन्य AssemblyScript ऑब्जेक्ट की तरह उपयोग किया जाता है।
-2. लेनदेन पर लूप करना
-3. प्रत्येक लेन-देन के लिए एक नया सबग्राफ entity बनाना
-
-एक ट्रिगर-आधारित सबग्राफ का विस्तृत उदाहरण देखने के लिए, [इस ट्यूटोरियल को देखें](/sps/tutorial/)।
-
-### Additional Resources
-
-अपने पहले प्रोजेक्ट को डेवलपमेंट कंटेनर में स्कैफोल्ड करने के लिए, इनमें से किसी एक [How-To Guide](/substreams/developing/dev-container/) को देखें। diff --git a/website/src/pages/hi/substreams/sps/tutorial.mdx b/website/src/pages/hi/substreams/sps/tutorial.mdx deleted file mode 100644 index e7dab45640d3..000000000000 --- a/website/src/pages/hi/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@
----
-title: 'ट्यूटोरियल: Solana पर एक Substreams-शक्ति वाले Subgraph सेट करें'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## शुरू करिये
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial)
-
-### आवश्यक शर्तें
-
-शुरू करने से पहले, सुनिश्चित करें कि:
-
-- अपने विकास पर्यावरण को सेट अप करने के लिए [Getting Started Guide](https://github.com/streamingfast/substreams-starter) को पूरा करें, एक Dev Container का उपयोग करके।
-- The Graph और मूल ब्लॉकचेन अवधारणाओं जैसे कि लेनदेन और Protobufs से परिचित रहें।
-
-### चरण 1: अपने प्रोजेक्ट को प्रारंभ करें
-
-1. अपने Dev Container को खोलें और अपने प्रोजेक्ट को शुरू करने के लिए निम्नलिखित कमांड चलाएं:
-
-   ```bash
-   substreams init
-   ```
-
-2. "Minimal" प्रोजेक्ट विकल्प चुनें।
-
-3. `substreams.yaml` फ़ाइल की सामग्री को निम्नलिखित कॉन्फ़िगरेशन से बदलें, जो SPL टोकन प्रोग्राम आईडी पर Orca अकाउंट के लेनदेन को फ़िल्टर करता है:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### चरण 2: Subgraph Manifest उत्पन्न करें
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-आप `subgraph.yaml` मैनिफेस्ट बनाएंगे जो डेटा स्रोत के रूप में Substreams पैकेज को इम्पोर्ट करता है:
-
-```yaml
----
-dataSources:
-  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-   
moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### चरण 3: schema.graphql में संस्थाएँ परिभाषित करें - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -यह स्कीमा एक MyTransfer एंटिटी को परिभाषित करता है जिसमें फ़ील्ड्स जैसे कि id, amount, source, designation, और signers शामिल हैं। - -### चरण 4: mappings.ts में Substreams डेटा को संभालें - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - 
entity.save()
-    }
-  }
-}
-```
-
-### चरण 5: Protobuf फ़ाइलें उत्पन्न करें
-
-AssemblyScript में Protobuf ऑब्जेक्ट बनाने के लिए, निम्नलिखित कमांड चलाएँ:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### निष्कर्ष
-
-Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
-
-### Video Tutorial
-
-
-
-### Additional Resources
-
-अधिक उन्नत अनुकूलन और ऑप्टिमाइजेशन के लिए, आधिकारिक [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana) देखें। diff --git a/website/src/pages/it/substreams/sps/_meta.js b/website/src/pages/it/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/it/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/it/substreams/sps/faq.mdx b/website/src/pages/it/substreams/sps/faq.mdx deleted file mode 100644 index 250c466d5929..000000000000 --- a/website/src/pages/it/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@
----
-title: Substreams-Powered Subgraphs FAQ
-sidebarTitle: FAQ
----
-
-## What are Substreams?
-
-Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
-
-Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer.
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations, can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if it had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in Assemblyscript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, substreams-powered Subgraphs have a single datasource which references a substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelised processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. 
They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. 
- Prevents downtimes: Designed from the ground up for high availability.

- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.

- Richest data model: A data model that includes balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.

- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.

## Where can developers access more information about Substreams-powered Subgraphs and Substreams?

The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.

The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.

The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code.

## What is the role of Rust modules in Substreams?

Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.

See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.

## What makes Substreams composable?

When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused.

As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle.
A single Substreams request will package all of these individuals' modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers.

## How can you build and deploy a Substreams-powered Subgraph?

After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).

## Where can I find examples of Substreams and Substreams-powered Subgraphs?

You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.

## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?

The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.

diff --git a/website/src/pages/it/substreams/sps/introduction.mdx b/website/src/pages/it/substreams/sps/introduction.mdx
deleted file mode 100644
index 0e5be69aa0c3..000000000000
--- a/website/src/pages/it/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
---
title: Introduction to Substreams-Powered Subgraphs
sidebarTitle: Introduction
---

Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.

## Overview

Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.

### Specifics

There are two methods of enabling this technology:

1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph.
This method creates the Subgraph entities directly in the Subgraph.

2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.

You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.

### Additional Resources

Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:

- [Solana](/substreams/developing/solana/transactions/)
- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)

diff --git a/website/src/pages/it/substreams/sps/triggers.mdx b/website/src/pages/it/substreams/sps/triggers.mdx
deleted file mode 100644
index 711dcaa6423a..000000000000
--- a/website/src/pages/it/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
---
title: Substreams Triggers
---

Use Custom Triggers and enable full use of GraphQL.

## Overview

Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.

By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler.
This ensures efficient and streamlined data management within the Subgraph framework.

### Defining `handleTransactions`

The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.

```tsx
export function handleTransactions(bytes: Uint8Array): void {
  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
  if (transactions.length == 0) {
    log.info('No transactions found', [])
    return
  }

  for (let i = 0; i < transactions.length; i++) {
    // 2.
    let transaction = transactions[i]

    let entity = new Transaction(transaction.hash) // 3.
    entity.from = transaction.from
    entity.to = transaction.to
    entity.save()
  }
}
```

Here's what you're seeing in the `mappings.ts` file:

1. The bytes containing Substreams data are decoded into the generated `Transactions` object; it is used like any other AssemblyScript object
2. Loop over the transactions
3. Create a new Subgraph entity for every transaction

To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).

### Additional Resources

To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).

diff --git a/website/src/pages/it/substreams/sps/tutorial.mdx b/website/src/pages/it/substreams/sps/tutorial.mdx
deleted file mode 100644
index 98708410813b..000000000000
--- a/website/src/pages/it/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
---
title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
sidebarTitle: Tutorial
---

Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
## Get Started

For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).

### Prerequisites

Before starting, make sure to:

- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.

### Step 1: Initialize Your Project

1. Open your Dev Container and run the following command to initialize your project:

   ```bash
   substreams init
   ```

2. Select the "minimal" project option.

3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:

```yaml
specVersion: v0.1.0
package:
  name: my_project_sol
  version: v0.1.0

imports: # Pass your spkg of interest
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg

modules:
  - name: map_spl_transfers
    use: solana:map_block # Select corresponding modules available within your spkg
    initialBlock: 260000082

  - name: map_transactions_by_programid
    use: solana:solana:transactions_by_programid_without_votes

network: solana-mainnet-beta

params: # Modify the param fields to meet your needs
  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```

### Step 2: Generate the Subgraph Manifest

Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:

```bash
substreams codegen subgraph
```

You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source:

```yaml
---
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers # Module defined in the substreams.yaml
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.9
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```

### Step 3: Define Entities in `schema.graphql`

Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.

Here is an example:

```graphql
type MyTransfer @entity {
  id: ID!
  amount: String!
  source: String!
  designation: String!
  signers: [String!]!
}
```

This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.

### Step 4: Handle Substreams Data in `mappings.ts`

With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.

The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:

```ts
import { Protobuf } from 'as-proto/assembly'
import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
import { MyTransfer } from '../generated/schema'

export function handleTriggers(bytes: Uint8Array): void {
  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)

  for (let i = 0; i < input.data.length; i++) {
    const event = input.data[i]

    if (event.transfer != null) {
      let entity_id: string = `${event.txnId}-${i}`
      const entity = new MyTransfer(entity_id)
      entity.amount = event.transfer!.instruction!.amount.toString()
      entity.source = event.transfer!.accounts!.source
      entity.designation = event.transfer!.accounts!.destination

      if (event.transfer!.accounts!.signer!.single != null) {
        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
      } else if (event.transfer!.accounts!.signer!.multisig != null) {
        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
      }
      entity.save()
    }
  }
}
```

### Step 5: Generate Protobuf Files

To generate Protobuf objects in AssemblyScript, run the following command:

```bash
npm run protogen
```

This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.

### Conclusion

Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.

### Video Tutorial

### Additional Resources

For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).

diff --git a/website/src/pages/ja/substreams/sps/_meta.js b/website/src/pages/ja/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/ja/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
export default {
  introduction: '',
  triggers: '',
  tutorial: '',
  faq: '',
}

diff --git a/website/src/pages/ja/substreams/sps/faq.mdx b/website/src/pages/ja/substreams/sps/faq.mdx
deleted file mode 100644
index c038b396b268..000000000000
--- a/website/src/pages/ja/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
---
title: Substreams-Powered Subgraphs FAQ
sidebarTitle: FAQ
---

## What are Substreams?

Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.

Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer.
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.

Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.

## What are Substreams-powered Subgraphs?

[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.

If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.

## How are Substreams-powered Subgraphs different from Subgraphs?

Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which they happen onchain.

By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by Graph Node. Compared to conventional Subgraphs, Substreams have access to more granular onchain data, and can also benefit from massively parallelized processing, which can mean much faster processing times.

## What are the benefits of using Substreams-powered Subgraphs?

Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs.
They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.

## What are the benefits of Substreams?

There are many benefits to using Substreams, including:

- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data.

- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).

- Sink anywhere: Sink your data anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.

- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.

- Access to additional data that is not available over JSON-RPC.

- All the benefits of the Firehose.

## What is the Firehose?

Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.

Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.

## What are the benefits of the Firehose?

There are many benefits to using Firehose, including:

- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.

- Prevents downtimes: Designed from the ground up for high availability.

- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.

- Richest data model: A data model that includes balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.

- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.

## Where can developers access more information about Substreams-powered Subgraphs and Substreams?

The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.

The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code.

## What is the role of Rust modules in Substreams?

Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.

See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.

## What makes Substreams composable?

When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused.

As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individuals' modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers.

## How can you build and deploy a Substreams-powered Subgraph?

After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).

## Where can I find examples of Substreams and Substreams-powered Subgraphs?

You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.

## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.

diff --git a/website/src/pages/ja/substreams/sps/introduction.mdx b/website/src/pages/ja/substreams/sps/introduction.mdx
deleted file mode 100644
index 71fabdd0416c..000000000000
--- a/website/src/pages/ja/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
---
title: Introduction to Substreams-Powered Subgraphs
sidebarTitle: Introduction
---

Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.

## Overview

Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.

### Specifics

There are two methods of enabling this technology:

1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph.

2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.

You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.
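For the Entity Changes method, the wiring on the Substreams side is a map module whose output type is the entity-changes Protobuf that graph-node consumes. The fragment below is a sketch only: the input source is illustrative, though `graph_out` is the conventional module name and the output type is the one the Subgraph sink expects.

```yaml
# Illustrative substreams.yaml fragment for the Entity Changes path.
# The input shown here assumes an EVM chain; adapt it to your network.
modules:
  - name: graph_out
    kind: map
    inputs:
      - source: sf.ethereum.type.v2.Block
    output:
      type: proto:sf.substreams.sink.entity.v1.EntityChanges
```

With this in place, graph-node consumes the `EntityChanges` stream directly, so no AssemblyScript handler is needed for this path.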
### Additional Resources

Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:

- [Solana](/substreams/developing/solana/transactions/)
- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)

diff --git a/website/src/pages/ja/substreams/sps/triggers.mdx b/website/src/pages/ja/substreams/sps/triggers.mdx
deleted file mode 100644
index 9ddb07c5477c..000000000000
--- a/website/src/pages/ja/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
---
title: Substreams Triggers
---

Use Custom Triggers and enable full use of GraphQL.

## Overview

Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.

By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.

### Defining `handleTransactions`

The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.

```tsx
export function handleTransactions(bytes: Uint8Array): void {
  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
  if (transactions.length == 0) {
    log.info('No transactions found', [])
    return
  }

  for (let i = 0; i < transactions.length; i++) {
    // 2.
    let transaction = transactions[i]

    let entity = new Transaction(transaction.hash) // 3.
    entity.from = transaction.from
    entity.to = transaction.to
    entity.save()
  }
}
```

Here's what you're seeing in the `mappings.ts` file:

1. The bytes containing Substreams data are decoded into the generated `Transactions` object; it is used like any other AssemblyScript object
2. Loop over the transactions
3. Create a new Subgraph entity for every transaction

To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).

### Additional Resources

To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).

diff --git a/website/src/pages/ja/substreams/sps/tutorial.mdx b/website/src/pages/ja/substreams/sps/tutorial.mdx
deleted file mode 100644
index 33a08342de34..000000000000
--- a/website/src/pages/ja/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
---
title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
sidebarTitle: Tutorial
---

Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.

## Get Started

For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).

### Prerequisites

Before starting, make sure to:

- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.

### Step 1: Initialize Your Project

1. Open your Dev Container and run the following command to initialize your project:

   ```bash
   substreams init
   ```

2. Select the "minimal" project option.

3.
Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:

```yaml
specVersion: v0.1.0
package:
  name: my_project_sol
  version: v0.1.0

imports: # Pass your spkg of interest
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg

modules:
  - name: map_spl_transfers
    use: solana:map_block # Select corresponding modules available within your spkg
    initialBlock: 260000082

  - name: map_transactions_by_programid
    use: solana:solana:transactions_by_programid_without_votes

network: solana-mainnet-beta

params: # Modify the param fields to meet your needs
  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```

### Step 2: Generate the Subgraph Manifest

Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:

```bash
substreams codegen subgraph
```

You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source:

```yaml
---
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers # Module defined in the substreams.yaml
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.9
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```

### Step 3: Define Entities in `schema.graphql`

Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.

Here is an example:

```graphql
type MyTransfer @entity {
  id: ID!
  amount: String!
  source: String!
  designation: String!
  signers: [String!]!
}
```

This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.

### Step 4: Handle Substreams Data in `mappings.ts`

With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.

The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:

```ts
import { Protobuf } from 'as-proto/assembly'
import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
import { MyTransfer } from '../generated/schema'

export function handleTriggers(bytes: Uint8Array): void {
  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)

  for (let i = 0; i < input.data.length; i++) {
    const event = input.data[i]

    if (event.transfer != null) {
      let entity_id: string = `${event.txnId}-${i}`
      const entity = new MyTransfer(entity_id)
      entity.amount = event.transfer!.instruction!.amount.toString()
      entity.source = event.transfer!.accounts!.source
      entity.designation = event.transfer!.accounts!.destination

      if (event.transfer!.accounts!.signer!.single != null) {
        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
      } else if (event.transfer!.accounts!.signer!.multisig != null) {
        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
      }
      entity.save()
    }
  }
}
```

### Step 5: Generate Protobuf Files

To generate Protobuf objects in AssemblyScript, run the following command:

```bash
npm run protogen
```

This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.

### Conclusion

Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
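Once the Subgraph is deployed and synced, the `MyTransfer` entities defined in Step 3 can be queried through the generated GraphQL API. Graph Node pluralizes entity names, so a query for the first few transfers might look like this (`first` and `orderBy` are standard query parameters):

```graphql
{
  myTransfers(first: 5, orderBy: id) {
    id
    amount
    source
    designation
    signers
  }
}
```

Each result corresponds to one entity saved by `handleTriggers`, keyed by the `txnId`-and-index ID built in the handler.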
### Video Tutorial

### Additional Resources

For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).

diff --git a/website/src/pages/ko/substreams/sps/_meta.js b/website/src/pages/ko/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/ko/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
export default {
  introduction: '',
  triggers: '',
  tutorial: '',
  faq: '',
}

diff --git a/website/src/pages/ko/substreams/sps/faq.mdx b/website/src/pages/ko/substreams/sps/faq.mdx
deleted file mode 100644
index 250c466d5929..000000000000
--- a/website/src/pages/ko/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
---
title: Substreams-Powered Subgraphs FAQ
sidebarTitle: FAQ
---

## What are Substreams?

Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.

Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.

Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.

## What are Substreams-powered Subgraphs?

[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs.
When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.

If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.

## How are Substreams-powered Subgraphs different from Subgraphs?

Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which they happen onchain.

By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by Graph Node. Compared to conventional Subgraphs, Substreams have access to more granular onchain data, and can also benefit from massively parallelized processing, which can mean much faster processing times.

## What are the benefits of using Substreams-powered Subgraphs?

Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.

## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data that is not available over JSON-RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using the Firehose, including:
-
-- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtime: Designed from the ground up for high availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: Includes balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.
-
-- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.
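The cursor bullet above can be made concrete with a small sketch. This is not the Firehose client API — the `Block` type and the `stream` function below are invented for illustration — it only shows the resume-from-cursor idea: persist the cursor after each block, and a restart continues exactly where the previous run stopped.

```typescript
// Illustrative only: a simulated block source that resumes from an opaque
// cursor, standing in for a Firehose stream. Not a real client API.
type Block = { number: number; cursor: string }

function* stream(blocks: Block[], fromCursor: string | null): Generator<Block> {
  let started = fromCursor === null
  for (const b of blocks) {
    if (started) yield b
    else if (b.cursor === fromCursor) started = true // resume point found
  }
}

const chain: Block[] = [1, 2, 3, 4].map((n) => ({ number: n, cursor: `c${n}` }))

const processed: number[] = []
let savedCursor: string | null = null

// First run: process two blocks, persisting the cursor after each, then "crash".
let count = 0
for (const b of stream(chain, savedCursor)) {
  processed.push(b.number)
  savedCursor = b.cursor
  if (++count === 2) break
}

// Restart from the saved cursor: no block is missed or processed twice.
for (const b of stream(chain, savedCursor)) {
  processed.push(b.number)
  savedCursor = b.cursor
}
```

The same pattern holds across forks in the real system: the cursor encodes enough state for the server to put the consumer back on the right branch.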
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request packages all of these individuals' modules and links them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).
-
-## Where can I find examples of Substreams and Substreams-powered Subgraphs?
-
-You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.
-
-## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
-
-The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.
diff --git a/website/src/pages/ko/substreams/sps/introduction.mdx b/website/src/pages/ko/substreams/sps/introduction.mdx
deleted file mode 100644
index 92d8618165dd..000000000000
--- a/website/src/pages/ko/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
----
-title: Introduction to Substreams-Powered Subgraphs
-sidebarTitle: Introduction
----
-
-Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.
-
-## Overview
-
-Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
-
-### Specifics
-
-There are two methods of enabling this technology:
-
-1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and moving all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph.
-
-2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.
-
-You can choose where to place your logic, either in the Subgraph or Substreams. 
However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the Graph Node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/ko/substreams/sps/triggers.mdx b/website/src/pages/ko/substreams/sps/triggers.mdx
deleted file mode 100644
index 66687aa21889..000000000000
--- a/website/src/pages/ko/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable the full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. 
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object.
-2. The handler loops over the transactions.
-3. A new Subgraph entity is created for every transaction.
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/ko/substreams/sps/tutorial.mdx b/website/src/pages/ko/substreams/sps/tutorial.mdx
deleted file mode 100644
index 7358f8c02a20..000000000000
--- a/website/src/pages/ko/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Get Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. 
-
-### Step 1: Initialize Your Project
-
-1. Open your Dev Container and run the following command to initialize your project:
-
-   ```bash
-   substreams init
-   ```
-
-2. Select the "minimal" project option.
-
-3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### Step 2: Generate the Subgraph Manifest
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
-
-```yaml
----
-dataSources:
-  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-        moduleName: map_spl_transfers # Module defined in the substreams.yaml
-        file: ./my-project-sol-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers
-```
-
-### Step 3: Define Entities in `schema.graphql`
-
-Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. 
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-}
-```
-
-This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
-
-### Step 4: Handle Substreams Data in `mappings.ts`
-
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.
-
-The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:
-
-```ts
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
-
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
-
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
-
-    if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
-      }
-      entity.save()
-    }
-  }
-}
-```
-
-### Step 5: Generate Protobuf Files
-
-To generate Protobuf objects in AssemblyScript, run the following command:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### Conclusion
-
-Congratulations! 
You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/mr/substreams/sps/_meta.js b/website/src/pages/mr/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/mr/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/mr/substreams/sps/faq.mdx b/website/src/pages/mr/substreams/sps/faq.mdx deleted file mode 100644 index 250c466d5929..000000000000 --- a/website/src/pages/mr/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? 
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if the data had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. 
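To illustrate the sink reuse described above — the `Sink` interface below is invented for this sketch and is not any real sink API — the same module output can be fanned out to interchangeable sinks, where a real deployment would swap in PostgreSQL, MongoDB, or Kafka writers:

```typescript
// Hypothetical sink abstraction: every sink consumes the same module output.
interface Sink {
  write(entity: { id: string; amount: string }): void
}

// In-memory stand-in for a real database or message-queue writer.
class MemorySink implements Sink {
  rows: string[] = []
  write(e: { id: string; amount: string }): void {
    this.rows.push(`${e.id}:${e.amount}`)
  }
}

// Simulated output of one Substreams module for a single block.
const moduleOutput = [
  { id: 'transfer-1', amount: '100' },
  { id: 'transfer-2', amount: '250' },
]

// The transformation stays the same; only the sinks differ.
const sinks: MemorySink[] = [new MemorySink(), new MemorySink()]
for (const sink of sinks) {
  for (const entity of moduleOutput) sink.write(entity)
}
```

The point of the sketch is the decoupling: the module's output shape is fixed once, and each target system only needs an adapter that implements the write path.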
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data that is not available over JSON-RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using the Firehose, including:
-
-- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtime: Designed from the ground up for high availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: Includes balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. 
-
-- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request packages all of these individuals' modules and links them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). 
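The Alice/Bob/Lisa composition described above takes the form of imports in a Substreams manifest. The following sketch is purely illustrative — the package name, URL, and module names are invented — but it shows the shape of a manifest that builds on a module published by someone else:

```yaml
specVersion: v0.1.0
package:
  name: price_oracle # hypothetical consumer package
  version: v0.1.0

imports:
  # Hypothetical community package, reused instead of re-extracting raw data
  dex_prices: https://example.com/alice-dex-prices-v0.1.0.spkg

modules:
  - name: map_oracle_price
    kind: map
    inputs:
      - map: dex_prices:map_pair_prices # module from the imported package
    output:
      type: proto:example.oracle.v1.OraclePrice
```

A single request for `map_oracle_price` pulls in the imported module automatically, and cached results for the community module can be shared by every consumer that depends on it.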
-
-## Where can I find examples of Substreams and Substreams-powered Subgraphs?
-
-You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.
-
-## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
-
-The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.
diff --git a/website/src/pages/mr/substreams/sps/introduction.mdx b/website/src/pages/mr/substreams/sps/introduction.mdx
deleted file mode 100644
index d22d998dee0d..000000000000
--- a/website/src/pages/mr/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
----
-title: Introduction to Substreams-Powered Subgraphs
-sidebarTitle: Introduction
----
-
-Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.
-
-## Overview
-
-Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
-
-### Specifics
-
-There are two methods of enabling this technology:
-
-1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and moving all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph.
-
-2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.
-
-You can choose where to place your logic, either in the Subgraph or Substreams. 
However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the Graph Node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/mr/substreams/sps/triggers.mdx b/website/src/pages/mr/substreams/sps/triggers.mdx
deleted file mode 100644
index df877d792fad..000000000000
--- a/website/src/pages/mr/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable the full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. 
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object.
-2. The handler loops over the transactions.
-3. A new Subgraph entity is created for every transaction.
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/mr/substreams/sps/tutorial.mdx b/website/src/pages/mr/substreams/sps/tutorial.mdx
deleted file mode 100644
index f72e82459cc5..000000000000
--- a/website/src/pages/mr/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Get Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. 
-
-### Step 1: Initialize Your Project
-
-1. Open your Dev Container and run the following command to initialize your project:
-
-   ```bash
-   substreams init
-   ```
-
-2. Select the "minimal" project option.
-
-3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### Step 2: Generate the Subgraph Manifest
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:
-
-```yaml
----
-dataSources:
-  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-        moduleName: map_spl_transfers # Module defined in the substreams.yaml
-        file: ./my-project-sol-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers
-```
-
-### Step 3: Define Entities in `schema.graphql`
-
-Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. 
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-}
-```
-
-This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
-
-### Step 4: Handle Substreams Data in `mappings.ts`
-
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.
-
-The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:
-
-```ts
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
-
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
-
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
-
-    if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
-      }
-      entity.save()
-    }
-  }
-}
-```
-
-### Step 5: Generate Protobuf Files
-
-To generate Protobuf objects in AssemblyScript, run the following command:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### Conclusion
-
-Congratulations! 
You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
-
-### Video Tutorial
-
-
-
-### Additional Resources
-
-For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/src/pages/nl/substreams/sps/_meta.js b/website/src/pages/nl/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/nl/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
-export default {
-  introduction: '',
-  triggers: '',
-  tutorial: '',
-  faq: '',
-}
diff --git a/website/src/pages/nl/substreams/sps/faq.mdx b/website/src/pages/nl/substreams/sps/faq.mdx
deleted file mode 100644
index 250c466d5929..000000000000
--- a/website/src/pages/nl/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-title: Substreams-Powered Subgraphs FAQ
-sidebarTitle: FAQ
----
-
-## What are Substreams?
-
-Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
-
-Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.
-
-Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
-
-## What are Substreams-powered Subgraphs? 
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
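As a rough illustration of this composability, one module's output can feed the next. The sketch below is conceptual only — names and types are invented for illustration, and real Substreams modules are Rust compiled to WASM, not TypeScript:

```typescript
// Conceptual sketch of Substreams-style module composition (hypothetical
// names and types; real modules are Rust compiled to WASM). One module's
// output feeds the next, like stacking LEGO blocks.
type Transfer = { from: string; to: string; amount: number };

// "Module" 1: extract the non-zero transfers from a decoded block
// (modeled here as a plain array).
function mapTransfers(block: Transfer[]): Transfer[] {
  return block.filter((t) => t.amount > 0);
}

// "Module" 2: build on module 1's output to aggregate volume per sender.
function mapVolume(transfers: Transfer[]): Map<string, number> {
  const volume = new Map<string, number>();
  for (const t of transfers) {
    volume.set(t.from, (volume.get(t.from) ?? 0) + t.amount);
  }
  return volume;
}

const block: Transfer[] = [
  { from: 'alice', to: 'bob', amount: 5 },
  { from: 'alice', to: 'carol', amount: 3 },
  { from: 'bob', to: 'carol', amount: 0 }, // filtered out by module 1
];
const volume = mapVolume(mapTransfers(block));
console.log(volume.get('alice')); // 8
```

In the real system this chaining is declared in the Substreams manifest, and intermediate module outputs can be cached and reused across consumers.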
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data that is not available over JSON-RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using Firehose, including:
-
-- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtime: Designed from the ground up for high availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: Includes balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.
-
-- Leverages flat files: Blockchain data is extracted into flat files, one of the cheapest and most optimized storage resources available.
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without writing any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request packages all of these individual modules and links them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).
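The price-oracle example above (Lisa combining four DEX price modules) can be sketched conceptually. Everything here is hypothetical — the module names, prices, and TypeScript stand in for actual Rust modules composed through a Substreams manifest:

```typescript
// Hypothetical sketch of a price oracle that combines four independent
// DEX price "modules" (plain functions here) into a single median quote.
type PriceModule = (token: string) => number;

// Stub modules with invented prices; real modules would read DEX state.
const dexA: PriceModule = () => 102;
const dexB: PriceModule = () => 98;
const dexC: PriceModule = () => 100;
const dexD: PriceModule = () => 104;

function medianPrice(modules: PriceModule[], token: string): number {
  const quotes = modules.map((m) => m(token)).sort((a, b) => a - b);
  const mid = quotes.length / 2;
  return quotes.length % 2 === 0
    ? (quotes[mid - 1] + quotes[mid]) / 2 // even count: mean of middle two
    : quotes[Math.floor(mid)];
}

const price = medianPrice([dexA, dexB, dexC, dexD], 'TOKEN');
console.log(price); // 101
```

The point of the composition model is that each upstream module's output is cached and reusable, so Lisa's oracle pays nothing extra for work Alice and Bob's modules already did.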
- -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this Github repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/nl/substreams/sps/introduction.mdx b/website/src/pages/nl/substreams/sps/introduction.mdx deleted file mode 100644 index 92d8618165dd..000000000000 --- a/website/src/pages/nl/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. 
However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/nl/substreams/sps/triggers.mdx b/website/src/pages/nl/substreams/sps/triggers.mdx
deleted file mode 100644
index 66687aa21889..000000000000
--- a/website/src/pages/nl/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable the full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can then be used like any other AssemblyScript object.
-2. The handler loops over the transactions.
-3. A new Subgraph entity is created for every transaction.
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/nl/substreams/sps/tutorial.mdx b/website/src/pages/nl/substreams/sps/tutorial.mdx
deleted file mode 100644
index fe78e2e9908f..000000000000
--- a/website/src/pages/nl/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Begin
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
- -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. 
- -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! 
You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/pl/substreams/sps/_meta.js b/website/src/pages/pl/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/pl/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/pl/substreams/sps/faq.mdx b/website/src/pages/pl/substreams/sps/faq.mdx deleted file mode 100644 index 250c466d5929..000000000000 --- a/website/src/pages/pl/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? 
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
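The "output to different sinks" idea can be illustrated with a toy fan-out. The interfaces below are invented for illustration only — they are not the real sink APIs — but they show the shape of the pattern: one stream of module output, several destinations:

```typescript
// Toy "sink anywhere" sketch (hypothetical interfaces, not the real sink
// APIs): the same module output is fanned out to several sinks — e.g.
// SQL, Kafka, a Subgraph — without re-running the transformation.
type EntityChange = { entity: string; id: string; fields: Record<string, string> };
type Sink = (change: EntityChange) => void;

const received: string[] = [];
const sqlSink: Sink = (c) => received.push(`sql:${c.id}`);
const kafkaSink: Sink = (c) => received.push(`kafka:${c.id}`);

// Dispatch every change to every registered sink.
function fanOut(changes: EntityChange[], sinks: Sink[]): void {
  for (const change of changes) {
    for (const sink of sinks) sink(change);
  }
}

fanOut(
  [{ entity: 'Transfer', id: 'tx1-0', fields: { amount: '42' } }],
  [sqlSink, kafkaSink],
);
console.log(received); // ['sql:tx1-0', 'kafka:tx1-0']
```

In practice each destination has its own dedicated sink binary or service consuming the same Substreams package, rather than an in-process dispatcher like this.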
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data that is not available over JSON-RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using Firehose, including:
-
-- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtime: Designed from the ground up for high availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: Includes balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.
-
-- Leverages flat files: Blockchain data is extracted into flat files, one of the cheapest and most optimized storage resources available.
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without writing any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request packages all of these individual modules and links them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).
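The Firehose stream-cursor behavior mentioned earlier — persist a cursor with each message, then resume exactly where you left off after a restart — can be sketched as a toy generator. The types and cursor format below are invented for illustration; real Firehose cursors are opaque strings returned with each block:

```typescript
// Toy sketch of Firehose-style cursor resumption (invented types and
// cursor format). The consumer persists the cursor it receives with each
// block, and on restart asks the stream to continue past that cursor.
type BlockMsg = { number: number; cursor: string };

function* stream(blocks: BlockMsg[], fromCursor?: string): Generator<BlockMsg> {
  // With no cursor, start from the beginning; otherwise skip until just
  // past the block whose cursor matches.
  let started = fromCursor === undefined;
  for (const b of blocks) {
    if (started) yield b;
    if (b.cursor === fromCursor) started = true;
  }
}

const chain: BlockMsg[] = [
  { number: 1, cursor: 'c1' },
  { number: 2, cursor: 'c2' },
  { number: 3, cursor: 'c3' },
];

// First session: process two blocks, persisting the cursor each time.
let saved: string | undefined;
for (const b of stream(chain)) {
  saved = b.cursor;
  if (b.number === 2) break;
}

// After a restart, resume from the saved cursor: only block 3 remains.
const resumed: number[] = [];
for (const b of stream(chain, saved)) resumed.push(b.number);
console.log(resumed); // resumed is [3]
```

A real cursor also encodes fork information, so resuming after a reorg replays the correct canonical blocks rather than a simple position.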
- -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this Github repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/pl/substreams/sps/introduction.mdx b/website/src/pages/pl/substreams/sps/introduction.mdx deleted file mode 100644 index 8c9483eb8feb..000000000000 --- a/website/src/pages/pl/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Wstęp ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. 
However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/pl/substreams/sps/triggers.mdx b/website/src/pages/pl/substreams/sps/triggers.mdx
deleted file mode 100644
index 66687aa21889..000000000000
--- a/website/src/pages/pl/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable the full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can then be used like any other AssemblyScript object.
-2. The handler loops over the transactions.
-3. A new Subgraph entity is created for every transaction.
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/pl/substreams/sps/tutorial.mdx b/website/src/pages/pl/substreams/sps/tutorial.mdx
deleted file mode 100644
index a795de7bb32b..000000000000
--- a/website/src/pages/pl/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Get Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
- -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. 
- -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! 
You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/pt/substreams/sps/_meta.js b/website/src/pages/pt/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/pt/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/pt/substreams/sps/faq.mdx b/website/src/pages/pt/substreams/sps/faq.mdx deleted file mode 100644 index 936b03bc0757..000000000000 --- a/website/src/pages/pt/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: 'Perguntas Frequentes: Subgraphs Movidos pelo Substreams' -sidebarTitle: Perguntas Frequentes ---- - -## O que são Substreams? - -O Substreams é um mecanismo de processamento excecionalmente poderoso, capaz de consumir ricos fluxos de dados de blockchain. Ele permite refinar e moldar dados de blockchain, para serem digeridos rápida e continuamente por aplicativos de utilizador final. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -O Substreams é programado pela [StreamingFast](https://www.streamingfast.io/). Para mais informações, visite a [Documentação do Substreams](/substreams/introduction/). - -## O que são subgraphs movidos por Substreams? 
- -[Subgraphs movidos pelo Substreams](/sps/introduction/) combinam o poder do Substreams com as queries de subgraphs. Ao editar um subgraph movido pelo Substreams, os dados produzidos pelas transformações do Substreams podem [produzir mudanças de entidade](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) compatíveis com entidades de subgraph. - -Se já entende da programação de subgraphs, observe que subgraphs movidos a Substreams podem ser consultados do mesmo jeito que se tivessem sido produzidos pela camada de transformação em AssemblyScript; isso com todos os benefícios do Subgraph, o que inclui uma API GraphQL dinâmica e flexível. - -## Como subgraphs movidos a Substreams diferem de subgraphs? - -Os subgraphs são compostos de fontes de dados que especificam eventos on-chain, e como transformar estes eventos através de handlers escritos em AssemblyScript. Estes eventos são processados em sequência, com base na ordem em que acontecem na chain. - -Por outro lado, subgraphs movidos pelo Substreams têm uma única fonte de dados que referencia um pacote de substreams, processado pelo Graph Node. Substreams têm acesso a mais dados granulares on-chain em comparação a subgraphs convencionais, e também podem se beneficiar de um processamento paralelizado em massa, o que pode diminuir muito a espera do processamento. - -## Quais os benefícios do uso de subgraphs movidos a Substreams? - -Subgraphs movidos a Substreams combinam todos os benefícios do Substreams com o potencial de query de subgraphs. Eles também trazem mais composabilidade e indexações de alto desempenho ao The Graph. 
Eles também resultam em novos casos de uso de dados; por exemplo, após construir o seu Subgraph movido a Substreams, é possível reutilizar os seus [módulos de Substreams](https://substreams.streamingfast.io/documentation/develop/manifest-modules) para usar [coletores de dados](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) diferentes, como PostgreSQL, MongoDB e Kafka. - -## Quais os benefícios do Substreams? - -Usar o Substreams incorre muitos benefícios, que incluem: - -- Compostável: Você pode empilhar módulos de Substreams como se fossem blocos de LEGO, e construir em cima de módulos da comunidade, para refinar dados públicos. - -- Indexação de alto desempenho: Indexação muito mais rápida através de clusters de larga escala de operações paralelas (como o BigQuery). - -- Colete dados em qualquer lugar: Mergulhe os seus dados onde quiser: PostgreSQL, MongoDB, Kafka, subgraphs, arquivos planos, Google Sheets. - -- Programável: Use códigos para personalizar a extração, realizar agregações de tempo de transformação, e modelar o seu resultado para vários sinks. - -- Acesso a dados tradicionais que não são disponíveis como parte do RPC em JSON - -- Todos os benefícios do Firehose. - -## O que é o Firehose? - -Desenvolvido pela [StreamingFast](https://www.streamingfast.io/), o Firehose é uma camada de extração de dados em blockchain desenhada do zero para processar o histórico completo de blockchains em velocidades nunca antes vistas. Com uma abordagem baseada em arquivos e que dá prioridade a transmissões, ele é um componente central do conjunto de tecnologias de código aberto da StreamingFast, e a fundação do Substreams. - -Confira a [documentação](https://firehose.streamingfast.io/) para aprender mais sobre o Firehose. - -## Quais os benefícios do Firehose? 
- -Há muitos benefícios do uso do Firehose, que incluem: - -- Latência menor: De forma que prioriza as transmissões, os nodes do Firehouse são desenhados para correrem para revelar os dados do bloco em primeiro lugar. - -- Evita downtimes: Desenhado do zero para Alta Disponibilidade. - -- Não perde nada: O cursor de transmissões do Firehose é desenhado para lidar com forks e continuar de onde você parou em qualquer condição. - -- Modelo rico de dados:  O melhor modelo de dados, que inclui as mudanças de saldo, a árvore de chamadas completa, transações internas, logs, mudanças de armazenamento, custos de gas, e mais. - -- Uso de arquivos planos: Dados de blockchain são extraídos em arquivos planos, o recurso de computação mais barato e otimizado disponível. - -## Onde programadores podem acessar mais informações sobre Substreams e subgraphs movidos a Substreams? - -Para aprender como construir módulos do Substreams, leia a [documentação do Substreams](/substreams/introduction/). - -Para aprender como empacotar subgraphs e implantá-los no The Graph, veja a [documentação sobre subgraphs movidos pelo Substreams](/sps/introduction/). - -A [ferramenta de Codegen no Substreams mais recente](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) permitirá ao programador inicializar um projeto no Substreams sem a necessidade de código. - -## Qual é o papel de módulos em Rust no Substreams? - -Módulos de Rust são o equivalente aos mapeadores em AssemblyScript em subgraphs. Eles são compilados em WASM de forma parecida, mas o modelo de programação permite execuções paralelas. Eles definem a categoria de transformações e agregações que você quer aplicar aos dados de blockchain crus. - -Veja a [documentação dos módulos](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) para mais detalhes. - -## O que faz o Substreams compostável? 
-
-Ao usar o Substreams, a composição é realizada na camada de transformação, permitindo o uso de módulos em cache.
-
-Como exemplo, Fulana pode construir um módulo de preço de DEX, Sicrano pode usá-lo para construir um agregador de volume para alguns tokens do seu interesse, e Beltrana pode combinar quatro módulos de preço de DEX individuais para criar um oráculo de preço. Um único pedido do Substreams empacotará todos estes módulos e os interligará para oferecer uma transmissão de dados muito mais refinada. Aquela transmissão pode então ser usada para popular um subgraph, e ser consultada pelos consumidores.
-
-## Como construir e publicar um Subgraph movido a Substreams?
-
-Após [definir](/sps/introduction/) um subgraph movido pelo Substreams, é possível usar a Graph CLI para implantá-lo no [Subgraph Studio](https://thegraph.com/studio/).
-
-## Onde posso encontrar exemplos de Substreams e subgraphs movidos a Substreams?
-
-Você pode visitar [este repositório do Github](https://github.com/pinax-network/awesome-substreams) para encontrar exemplos de Substreams e subgraphs movidos a Substreams.
-
-## O que Substreams e subgraphs movidos a Substreams significam para a Graph Network?
-
-A integração promete vários benefícios, incluindo indexações de altíssimo desempenho e mais composabilidade com o uso de módulos de comunidade e construção por cima deles.
diff --git a/website/src/pages/pt/substreams/sps/introduction.mdx b/website/src/pages/pt/substreams/sps/introduction.mdx
deleted file mode 100644
index c355e80d015a..000000000000
--- a/website/src/pages/pt/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
----
-title: Introdução a Subgraphs Movidos pelo Substreams
-sidebarTitle: Introdução
----
-
-Melhore a eficiência e a escalabilidade do seu subgraph com o [Substreams](/substreams/introduction/) para transmitir dados pré-indexados de blockchain.
-
-## Visão geral
-
-Use um pacote Substreams (`.spkg`) como fonte de dados para que o seu subgraph ganhe acesso a um fluxo de dados de blockchain pré-indexados. Isto resulta num tratamento de dados mais eficiente e escalável, especialmente com redes de blockchain grandes ou complexas.
-
-### Especificações
-
-Há dois métodos de ativar esta tecnologia:
-
-1. **Usar [gatilhos](/sps/triggers/)**: isto importa o modelo do Protobuf via um handler de subgraph, permitindo que o utilizador consuma de qualquer módulo do Substreams e mude toda a sua lógica para um subgraph. Este método cria as entidades diretamente no subgraph.
-
-2. **[Mudanças de Entidade](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: Ao inserir mais da lógica no Substreams, pode-se alimentar o rendimento do módulo diretamente no [graph-node](/indexing/tooling/graph-node/). No graph-node, os dados do Substreams podem ser usados para criar as entidades do seu subgraph.
-
-É possível escolher onde colocar a sua lógica, seja no subgraph ou no Substreams. Porém, considere o que supre as suas necessidades de dados; o Substreams tem um modelo paralelizado, e os gatilhos são consumidos de forma linear no graph-node.
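-
-Para concretizar a opção baseada em gatilhos, eis um esboço mínimo de como essa entrada de manifest pode ficar (os nomes do pacote, do módulo e dos arquivos são apenas ilustrativos): o pacote Substreams é a fonte de dados, e o resultado do módulo é encaminhado a um handler do subgraph.
-
-```yaml
-dataSources:
-  - kind: substreams
-    name: my_project
-    network: mainnet
-    source:
-      package:
-        moduleName: map_events # um módulo exportado pelo .spkg
-        file: ./my-project-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers # recebe o resultado do módulo como bytes de Protobuf
-```
-
-(Num projeto real, este arquivo é gerado pelo comando `substreams codegen subgraph`.)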
- -### Outros Recursos - -Visite os seguintes links para ver guias passo-a-passo sobre ferramentas de geração de código, para construir o seu primeiro projeto de ponta a ponta rapidamente: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/pt/substreams/sps/triggers.mdx b/website/src/pages/pt/substreams/sps/triggers.mdx deleted file mode 100644 index eafeca1e373f..000000000000 --- a/website/src/pages/pt/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Gatilhos do Substreams ---- - -Use Gatilhos Personalizados e ative o uso completo da GraphQL. - -## Visão geral - -Com Gatilhos Personalizados, é possível enviar dados diretamente ao arquivo de mapeamento do seu subgraph e às suas entidades; sendo esses aspetos parecidos com tabelas e campos. Assim, é possível usar a camada da GraphQL livremente. - -Estes dados podem ser recebidos e processados no handler do seu subgraph ao importar as definições do Protobuf emitidas pelo seu módulo do Substreams. Assim, o tratamento de dados na estrutura do subgraph fica mais simples e eficiente. - -### Como definir `handleTransactions` - -O código a seguir demonstra como definir uma função `handleTransactions` num handler de subgraph. Esta função recebe bytes brutos do Substreams como um parâmetro e os descodifica num objeto `Transactions`. Uma nova entidade de subgraph é criada para cada transação. 
- -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Você verá isto no arquivo `mappings.ts`: - -1. Os bytes contendo dados do Substreams são descodificados no objeto `Transactions` gerado; este é usado como qualquer outro objeto AssemblyScript -2. Um loop sobre as transações -3. Uma nova entidade de subgraph é criada para cada transação - -Para ver um exemplo detalhado de um subgraph baseado em gatilhos, [clique aqui](/sps/tutorial/). - -### Outros Recursos - -Para estruturar o seu primeiro projeto no Recipiente de Programação, confira [este guia](/substreams/developing/dev-container/). diff --git a/website/src/pages/pt/substreams/sps/tutorial.mdx b/website/src/pages/pt/substreams/sps/tutorial.mdx deleted file mode 100644 index 9c0719e36008..000000000000 --- a/website/src/pages/pt/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,137 +0,0 @@ ---- -title: 'Tutorial: Como Montar um Subgraph Movido a Substreams na Solana' -sidebarTitle: Tutorial ---- - -Configure um subgraph, movido pelo Substreams e baseado em gatilhos, para um token da SPL (Biblioteca de Protocolos da Solana) da Solana. - -## Como Começar - -Para ver um tutorial em vídeo sobre o assunto, [clique aqui](/sps/tutorial/#video-tutorial) - -### Pré-requisitos - -Antes de começar: - -- Complete o [Guia de Introdução](https://github.com/streamingfast/substreams-starter) para montar o seu ambiente de programação com um Recipiente de Programação. -- Familiarize-se com o The Graph e conceitos básicos de blockchain, como transações e Protobufs. 
- -### Passo 1: Inicialize o Seu Projeto - -1. Para inicializar o seu projeto, abra o seu Recipiente e execute o seguinte comando: - - ```bash - substreams init - ``` - -2. Selecione a opção de projeto "minimal" (mínimo). - -3. Troque os conteúdos do arquivo `substreams.yaml` gerado com a seguinte configuração, que filtra transações para a conta do Orca no ID de programa do token da SPL: - -```yaml -params: # Modifique os parâmetros a seu gosto - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Passo 2: Crie o Manifest do Subgraph - -Com o projeto inicializado, gere um manifest de subgraph com o seguinte comando no Recipiente de Programação: - -```bash -substreams codegen subgraph -``` - -Será gerado um manifest `subgraph.yaml` que importa o pacote do Substreams como uma fonte de dados: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Módulo definido no substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Passo 3: Defina as Entidades em `schema.graphql` - -Para definir os campos a guardar nas suas entidades de subgraph, atualize o arquivo `schema.graphql`. - -Por exemplo: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -Este schema define uma entidade `MyTransfer` com campos como `id`, `amount`, `source`, `designation`, e `signers`. - -### Passo 4: Controle Dados do Substreams no `mappings.ts` - -Com os objetos do Protobuf criados, agora você pode tratar os dados descodificados do Substreams no seu arquivo `mappings.ts` no diretório `./src`. 
- -O exemplo abaixo demonstra como extrair as transferências não derivadas associadas à id de conta do Orca para entidades de subgraph: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Passo 5: Crie Arquivos de Protobuf - -Para gerar objetos do Protobuf no AssemblyScript, execute: - -```bash -npm run protogen -``` - -Este comando converte as definições do Protobuf em AssemblyScript, permitindo o seu uso no handler do subgraph. - -### Conclusão - -Parabéns! Está montado um subgraph movido a Substreams, baseado em gatilhos, para um token da SPL da Solana. Agora dá para personalizar mais o seu schema, os seus mapeamentos, e os seus módulos de modo que combinem com o seu caso de uso específico. - -### Tutorial em vídeo - - - -### Outros Recursos - -Para otimizações e personalizações mais avançadas, veja a [documentação oficial do Substreams](https://substreams.streamingfast.io/tutorials/solana). 
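-
-Depois da implantação, as entidades `MyTransfer` definidas acima podem ser consultadas pela API GraphQL do subgraph. Um esboço ilustrativo de query (o campo plural `myTransfers` e o argumento `first` seguem as convenções de query geradas pelo The Graph; o endpoint exato depende da sua implantação):
-
-```graphql
-{
-  myTransfers(first: 5) {
-    id
-    amount
-    source
-    designation
-    signers
-  }
-}
-```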
diff --git a/website/src/pages/ro/substreams/sps/_meta.js b/website/src/pages/ro/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/ro/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
-export default {
-  introduction: '',
-  triggers: '',
-  tutorial: '',
-  faq: '',
-}
diff --git a/website/src/pages/ro/substreams/sps/faq.mdx b/website/src/pages/ro/substreams/sps/faq.mdx
deleted file mode 100644
index 250c466d5929..000000000000
--- a/website/src/pages/ro/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-title: Substreams-Powered Subgraphs FAQ
-sidebarTitle: FAQ
----
-
-## What are Substreams?
-
-Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
-
-Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.
-
-Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
-
-## What are Substreams-powered Subgraphs?
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelised processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
- -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. - -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model:  Best data model that includes the balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. 
- -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer enabling cached modules to be re-used. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual's modules, link them together, to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this Github repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. 
diff --git a/website/src/pages/ro/substreams/sps/introduction.mdx b/website/src/pages/ro/substreams/sps/introduction.mdx deleted file mode 100644 index 92d8618165dd..000000000000 --- a/website/src/pages/ro/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. 
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/ro/substreams/sps/triggers.mdx b/website/src/pages/ro/substreams/sps/triggers.mdx
deleted file mode 100644
index 66687aa21889..000000000000
--- a/website/src/pages/ro/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers and enable the full use of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
-2. Looping over the transactions
-3. Create a new Subgraph entity for every transaction
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out the [How-To Guide](/substreams/developing/dev-container/).
diff --git a/website/src/pages/ro/substreams/sps/tutorial.mdx b/website/src/pages/ro/substreams/sps/tutorial.mdx
deleted file mode 100644
index 7358f8c02a20..000000000000
--- a/website/src/pages/ro/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Get Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
-
-### Step 1: Initialize Your Project
-
-1. Open your Dev Container and run the following command to initialize your project:
-
-   ```bash
-   substreams init
-   ```
-
-2. Select the "minimal" project option.
-
-3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### Step 2: Generate the Subgraph Manifest
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source:
-
-```yaml
----
-dataSources:
-  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-        moduleName: map_spl_transfers # Module defined in the substreams.yaml
-        file: ./my-project-sol-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers
-```
-
-### Step 3: Define Entities in `schema.graphql`
-
-Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. 
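The `mappings.ts` handler above derives each entity ID as `${event.txnId}-${i}`, the transaction signature plus the loop index. A plain-TypeScript sketch of that scheme (the sample IDs are made up, not real Solana signatures) shows why the IDs stay unique within a transaction:

```typescript
// Sketch: entity IDs of the form `${txnId}-${index}`, mirroring the
// `entity_id` construction in mappings.ts. Sample txnIds are illustrative.
function makeEntityId(txnId: string, index: number): string {
  return `${txnId}-${index}`
}

// Every transfer in a transaction gets a distinct ID because the loop
// index is part of the key.
function idsAreUnique(txnId: string, transferCount: number): boolean {
  const seen = new Set<string>()
  for (let i = 0; i < transferCount; i++) {
    const id = makeEntityId(txnId, i)
    if (seen.has(id)) return false
    seen.add(id)
  }
  return true
}

console.log(makeEntityId('5xAbC', 0)) // "5xAbC-0"
console.log(idsAreUnique('5xAbC', 3)) // true
```

Keeping the index in the ID matters because a single Solana transaction can contain several transfers; the signature alone would collide.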
- -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/ru/substreams/sps/_meta.js b/website/src/pages/ru/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/ru/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/ru/substreams/sps/faq.mdx b/website/src/pages/ru/substreams/sps/faq.mdx deleted file mode 100644 index 45edab5a3d00..000000000000 --- a/website/src/pages/ru/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Часто задаваемые вопросы о Субграфах, работающих на основе Субпотоков -sidebarTitle: Часто задаваемые вопросы ---- - -## Что такое субпотоки? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Субграфы, работающие на основе Субпотоков](/sps/introduction/) объединяют мощь Субпотоков с возможностью запросов субграфов. 
При публикации субграфа, работающего на основе Субпотоков данные, полученные в результате преобразований Субпотоков, могут [генерировать изменения объектов](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs), которые совместимы с объектами субграфа. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if it had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Субграфы состоят из источников данных, которые указывают он-чейн события и то, как эти события должны быть преобразованы с помощью обработчиков, написанных на AssemblyScript. Эти события обрабатываются последовательно, в зависимости от того, в каком порядке они происходят он-чейн. - -В отличие от этого, субграфы, работающие на основе Субпотоков имеют один источник данных, который ссылается на пакет Субпотоков, обрабатываемый Graph Node. Субпотоки имеют доступ к дополнительным детализированным данным из он-чейна в отличии от традиционных субграфов, а также могут массово использовать параллельную обработку, что значительно ускоряет время обработки. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## В чем преимущества Субпотоков? 
- -Использование Субпотоков имеет много преимуществ, в том числе: - -- Компонуемость: Вы можете объединять модули Субпотоков, как блоки LEGO, и опираться на модули сообщества, дополнительно уточняя общедоступные данные. - -- Высокопроизводительное индексирование: индексирование на порядки быстрее благодаря крупномасштабным кластерам параллельных операций (как пример, BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Программируемость: Используйте код для настройки извлечения, выполнения агрегирования во время преобразования и моделирования выходных данных для нескольких приемников. - -- Доступ к дополнительным данным, недоступным в составе JSON RPC - -- Все преимущества Firehose. - -## Что такое Firehose? - -Firehose, разработанный [StreamingFast](https://www.streamingfast.io/), представляет собой уровень извлечения данных блокчейна, разработанный с нуля для обработки полной истории блокчейнов на ранее невиданных скоростях. Обеспечивая подход, основанный на файлах и потоковой передаче, он является основным компонентом пакета технологий Streamingfast с открытым исходным кодом и основой для Субпотоков. - -Перейдите к [documentation](https://firehose.streamingfast.io/), чтобы узнать больше о Firehose. - -## В чем преимущества Firehose? - -Использование Firehose имеет много преимуществ, в том числе: - -- Наименьшая задержка и отсутствие опроса: В режиме потоковой передачи узлы Firehose спроектированы таким образом, чтобы первыми передавать данные блока. - -- Предотвращает простои: Разработан с нуля для обеспечения высокой доступности. - -- Никогда не пропустите ни одного момента: Курсор потока Firehose предназначен для обработки форков и продолжения работы с того места, где Вы остановились, в любых условиях. 
- -- Богатейшая модель данных:  Лучшая модель данных, которая включает изменения баланса, полное дерево вызовов, внутренние транзакции, логи, изменения в хранилище, затраты на газ и многое другое. - -- Использует плоские файлы: Данные блокчейна извлекаются в плоские файлы — самый дешевый и наиболее оптимизированный доступный вычислительный ресурс. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -Из [документации по Субпотокам](/substreams/introduction/) Вы узнаете, как создавать модули Субпотоков. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -[Новейший инструмент Substreams Codegen](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) позволит Вам загрузить проект Substreams без использования какого-либо кода. - -## Какова роль модулей Rust в Субпотоках? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -Подробную информацию см. в [документации по модулям](https://docs.substreams.dev/reference-material/substreams-components/modules#modules). - -## Что делает Субпотоки компонуемыми? - -При использовании Субпотоков компоновка происходит на уровне преобразования, что позволяет повторно использовать кэшированные модули. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual's modules, link them together, to offer a much more refined stream of data. 
That stream can then be used to populate a Subgraph, and be queried by consumers. - -## Как Вы можете создать и развернуть субграф, работающий на основе Субпотоков? - -После [определения](/sps/introduction/) субграфа, работающего на основе Субпотоков, Вы можете использовать Graph CLI для его развертывания в [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this Github repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -Интеграция обещает множество преимуществ, включая чрезвычайно высокопроизводительную индексацию и большую компонуемость за счет использования модулей сообщества и развития на их основе. diff --git a/website/src/pages/ru/substreams/sps/introduction.mdx b/website/src/pages/ru/substreams/sps/introduction.mdx deleted file mode 100644 index d4c5118ad8f6..000000000000 --- a/website/src/pages/ru/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Введение в субграфы, работающие на основе Субпотоков -sidebarTitle: Введение ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Обзор - -Используя пакет Субпотоков (`.spkg`) в качестве источника данных, Ваш субграф получает доступ к потоку предварительно индексированных данных блокчейна. Это позволяет более эффективно и масштабируемо обрабатывать данные, особенно в крупных или сложных блокчейн-сетях. - -### Специфические особенности - -Существует два способа активации этой технологии: - -1. **Использование [триггеров](/sps/triggers/) Субпотоков**: Получайте данные из любого модуля Субпотоков, импортируя Protobuf-модель через обработчик субграфа, и переносите всю логику в субграф. 
Этот метод создает объекты субграфа непосредственно внутри субграфа. - -2. Использование [Изменений Объектов](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)\*\*: Записывая большую часть логики в Субпотоки, Вы можете напрямую передавать вывод модуля в [graph-node](/indexing/tooling/graph-node/). В graph-node можно использовать данные Субпотоков для создания объектов субграфа. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### Дополнительные ресурсы - -Перейдите по следующим ссылкам, чтобы ознакомиться с руководствами по использованию инструментов для генерации кода и быстро создать свой первый проект от начала до конца: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/ru/substreams/sps/triggers.mdx b/website/src/pages/ru/substreams/sps/triggers.mdx deleted file mode 100644 index 3e047577c67a..000000000000 --- a/website/src/pages/ru/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Триггеры Субпотоков ---- - -Use Custom Triggers and enable the full use GraphQL. - -## Обзор - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. 
This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -Следующий код демонстрирует, как определить функцию `handleTransactions` в обработчике субграфа. Эта функция принимает сырые байты Субпотоков в качестве параметра и декодирует их в объект `Transactions`. Для каждой транзакции создается новый объект субграфа. - -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Вот что Вы видите в файле `mappings.ts`: - -1. Байты, содержащие данные Субпотоков, декодируются в сгенерированный объект `Transactions`. Этот объект используется как любой другой объект на AssemblyScript -2. Итерация по транзакциям (процесс поочерёдного прохода по всем транзакциям для их анализа или обработки) -3. Создание нового объекта субграфа для каждой транзакции - -Чтобы ознакомиться с подробным примером субграфа на основе триггера, [ознакомьтесь с руководством](/sps/tutorial/). - -### Дополнительные ресурсы - -To scaffold your first project in the Development Container, check out one of the [How-To Guide](/substreams/developing/dev-container/). 
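The decode, loop, and save flow of the trigger handler above can be sketched in plain TypeScript. This is a sketch only: the real handler runs in AssemblyScript, and `decodeTransactions` and the in-memory `store` are stand-ins for the generated Protobuf decoder and graph-node's entity store.

```typescript
// Stand-in shapes for the decoded Protobuf transaction and the stored entity.
interface DecodedTransaction {
  hash: string
  from: string
  to: string
}

interface TransactionEntity {
  id: string
  from: string
  to: string
}

// Stand-in for `assembly.eth.transaction.v1.Transactions.decode(bytes)`;
// here the "decoding" is a pass-through for illustration.
function decodeTransactions(payload: DecodedTransaction[]): DecodedTransaction[] {
  return payload
}

// Stand-in for graph-node's entity store.
const store: TransactionEntity[] = []

function handleTransactions(payload: DecodedTransaction[]): void {
  const transactions = decodeTransactions(payload) // 1. decode
  if (transactions.length === 0) return
  for (const tx of transactions) {
    // 2. loop
    // 3. create and save one entity per transaction
    store.push({ id: tx.hash, from: tx.from, to: tx.to })
  }
}

handleTransactions([{ hash: '0xabc', from: '0x1', to: '0x2' }])
console.log(store.length) // 1
```

The structure is the point: decode once, then do a linear pass that emits one entity per decoded item, exactly as the AssemblyScript handler does.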
diff --git a/website/src/pages/ru/substreams/sps/tutorial.mdx b/website/src/pages/ru/substreams/sps/tutorial.mdx deleted file mode 100644 index 977f1803f352..000000000000 --- a/website/src/pages/ru/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,154 +0,0 @@ ---- -title: 'Руководство: Настройка Субграфа, работающего на основе Субпотоков в сети Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. - -## Начнем - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial) - -### Предварительные требования - -Прежде чем начать, убедитесь, что: - -- Завершили изучение [руководства по началу работы](https://github.com/streamingfast/substreams-starter), чтобы настроить свою среду разработки с использованием контейнера для разработки. -- Ознакомлены с The Graph и основными концепциями блокчейна, такими как транзакции и Protobuf. - -### Шаг 1: Инициализация Вашего проекта - -1. Откройте свой контейнер для разработки и выполните следующую команду для инициализации проекта: - - ```bash - substreams init - ``` - -2. Выберите вариант проекта "minimal". - -3. 
Замените содержимое сгенерированного файла `substreams.yaml` следующей конфигурацией, которая фильтрует транзакции для аккаунта Orca в идентификаторе программы токенов SPL: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Укажите нужный Вам spkg - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Выберите соответствующие модули, доступные в Вашем spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Измените параметры в соответствии со своими требованиями - # Для program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA: map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Шаг 2: Создание манифеста субграфа - -После инициализации проекта создайте манифест субграфа, выполнив следующую команду в Dev Container: - -```bash -substreams codegen subgraph -``` - -Вы создадите манифест `subgraph.yaml`, который импортирует пакет Субпотоков в качестве источника данных: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Модуль, определенный в substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Шаг 3: Определите объекты в `schema.graphql` - -Определите поля, которые хотите сохранить в объектах субграфа, обновив файл `schema.graphql`. - -Пример: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -Эта схема определяет объект `MyTransfer` с такими полями, как `id`, `amount`, `source`, `designation` и `signers`. 
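As a rough TypeScript mirror of the `MyTransfer` schema (illustrative only; the actual entity class is generated from `schema.graphql`), the required fields can be checked like this:

```typescript
// Rough mirror of the MyTransfer entity. Illustrative only: the real
// class is generated from schema.graphql by the tooling.
interface MyTransfer {
  id: string
  amount: string
  source: string
  designation: string
  signers: string[] // [String!]! — a non-null list of non-null strings
}

// Minimal shape check for the sketch above.
function isValidTransfer(t: MyTransfer): boolean {
  return (
    t.id.length > 0 &&
    [t.amount, t.source, t.designation].every((f) => typeof f === 'string') &&
    t.signers.every((s) => typeof s === 'string')
  )
}

const sample: MyTransfer = {
  id: 'sig-0',
  amount: '1000',
  source: 'srcAccount',
  designation: 'dstAccount',
  signers: ['signerA'],
}
console.log(isValidTransfer(sample)) // true
```

Note that `amount` is a `String` in the schema, so token amounts are stored as decimal strings rather than numbers.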
- -### Шаг 4: Обработка данных Субпотоков в `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Шаг 5: Сгенерируйте файлы Protobuf - -Чтобы сгенерировать объекты Protobuf в AssemblyScript, выполните следующую команду: - -```bash -npm run protogen -``` - -Эта команда преобразует определения Protobuf в AssemblyScript, позволяя использовать их в обработчике субграфа. - -### Заключение - -Поздравляем! Вы успешно настроили субграф на основе триггеров с поддержкой Субпотоков для токена Solana SPL. Следующий шаг Вы можете сделать, настроив схему, мэппинги и модули в соответствии со своим конкретным вариантом использования. 
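The single-versus-multisig branch in `handleTriggers` above can be isolated into a small helper. This plain-TypeScript sketch uses simplified stand-ins for the generated Protobuf signer types:

```typescript
// Simplified stand-ins for the generated Protobuf signer types.
interface SingleSigner {
  signer: string
}
interface MultiSigner {
  signers: string[]
}
interface SignerInfo {
  single: SingleSigner | null
  multisig: MultiSigner | null
}

// Mirrors the if/else-if branch in handleTriggers: a single signer becomes
// a one-element array, a multisig contributes all of its signers.
function extractSigners(info: SignerInfo): string[] {
  if (info.single != null) {
    return [info.single.signer]
  } else if (info.multisig != null) {
    return info.multisig.signers
  }
  return []
}

console.log(extractSigners({ single: { signer: 'A' }, multisig: null })) // ["A"]
console.log(extractSigners({ single: null, multisig: { signers: ['A', 'B'] } })) // ["A", "B"]
```

Normalizing both cases to a `string[]` is what lets the entity declare a single `signers: [String!]!` field regardless of how the transfer was signed.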
- -### Video Tutorial - - - -### Дополнительные ресурсы - -Для более продвинутой настройки и оптимизации ознакомьтесь с официальной [документацией по Субпотокам](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/sv/substreams/sps/_meta.js b/website/src/pages/sv/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/sv/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/sv/substreams/sps/faq.mdx b/website/src/pages/sv/substreams/sps/faq.mdx deleted file mode 100644 index e5313465d87c..000000000000 --- a/website/src/pages/sv/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## Vad är Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. 
When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations, can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if it had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in Assemblyscript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, substreams-powered Subgraphs have a single datasource which references a substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelised processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## Vilka fördelar har Substreams? 
- -Det finns många fördelar med att använda Substreams, inklusive: - -- Sammansättbarhet: Du kan stapla Substreams-moduler som LEGO-block och bygga på gemenskapsmoduler för att ytterligare förädla offentliga data. - -- Högpresterande indexering: Ordervärden snabbare indexering genom storskaliga kluster av parallella operationer (tänk BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmerbarhet: Använd kod för att anpassa extrahering, utföra transformationsbaserade aggregeringar och modellera din utdata för flera sänkar. - -- Tillgång till ytterligare data som inte är tillgänglig som en del av JSON RPC - -- Alla fördelar med Firehose. - -## Vad är Firehose? - -Utvecklat av [StreamingFast](https://www.streamingfast.io/), är Firehose ett blockkedjedata-extraktionslager som är utformat från grunden för att bearbeta blockkedjans fullständiga historik med hastigheter som tidigare inte var skådade. Genom att erbjuda en filbaserad och strömningsorienterad metod är det en kärnkomponent i StreamingFasts svit med öppen källkodstekniker och grunden för Substreams. - -Gå till [documentation](https://firehose.streamingfast.io/) för att lära dig mer om Firehose. - -## Vilka fördelar har Firehose? - -Det finns många fördelar med att använda Firehose, inklusive: - -- Lägsta latens och ingen avfrågning: I en strömningsorienterad stil är Firehose-noderna utformade för att snabbt skicka ut blockdata. - -- Förebygger driftstopp: Designat från grunden för hög tillgänglighet. - -- Missa aldrig en händelse: Firehose-strömmens markör är utformad för att hantera gafflar och att fortsätta där du avslutade under alla förhållanden. - -- Rikaste datamodell:  Bästa datamodell som inkluderar balansändringar, hela anropsträdet, interna transaktioner, loggar, lagringsändringar, gasavgifter och mer. 
- -- Använder platta filer: Blockkedjedata extraheras till platta filer, den billigaste och mest optimerade datorkällan som finns tillgänglig. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## Vad är rollen för Rust-moduler i Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## Vad gör Substreams sammansättbart? - -Vid användning av Substreams sker sammansättningen på omvandlingsnivån, vilket gör att cachade moduler kan återanvändas. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual's modules, link them together, to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## Hur kan man bygga och distribuera en Substreams-drivna subgraf? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). 
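The module composition described above can be pictured as function composition over a stream of blocks. The sketch below is a conceptual analogy only (real Substreams modules are Rust compiled to WASM, and all names here are made up):

```typescript
// Conceptual analogy: Substreams modules compose like functions over blocks.
type Trade = { token: string; price: number; volume: number }
type Block = { trades: Trade[] }

// "Alice's" price module: latest price seen per token in the block.
function priceModule(block: Block): Map<string, number> {
  const prices = new Map<string, number>()
  for (const t of block.trades) prices.set(t.token, t.price)
  return prices
}

// "Bob's" volume aggregator builds on the same block stream.
function volumeModule(block: Block): Map<string, number> {
  const volumes = new Map<string, number>()
  for (const t of block.trades) {
    volumes.set(t.token, (volumes.get(t.token) ?? 0) + t.volume)
  }
  return volumes
}

// A downstream module combines both upstream outputs, the way an oracle
// might combine several price modules.
function combined(block: Block): Trade[] {
  const prices = priceModule(block)
  const volumes = volumeModule(block)
  const out: Trade[] = []
  prices.forEach((price, token) => {
    out.push({ token, price, volume: volumes.get(token) ?? 0 })
  })
  return out
}

const block: Block = { trades: [{ token: 'SOL', price: 150, volume: 10 }] }
console.log(combined(block)) // one entry: SOL at price 150, volume 10
```

The caching benefit follows from this shape: because each module is a pure transformation of its inputs, upstream outputs can be cached and reused by any downstream consumer.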
- -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this Github repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -Integrationen lovar många fördelar, inklusive extremt högpresterande indexering och ökad sammansättbarhet genom att dra nytta av gemenskapsmoduler och bygga vidare på dem. diff --git a/website/src/pages/sv/substreams/sps/introduction.mdx b/website/src/pages/sv/substreams/sps/introduction.mdx deleted file mode 100644 index 30e643fff68a..000000000000 --- a/website/src/pages/sv/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduktion ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Översikt - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. 
However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### Ytterligare resurser - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/sv/substreams/sps/triggers.mdx b/website/src/pages/sv/substreams/sps/triggers.mdx deleted file mode 100644 index 77b382a28280..000000000000 --- a/website/src/pages/sv/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers ---- - -Use Custom Triggers and enable the full use GraphQL. - -## Översikt - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. 
- -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object, this object is used like any other AssemblyScript object -2. Looping over the transactions -3. Create a new Subgraph entity for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Ytterligare resurser - -To scaffold your first project in the Development Container, check out one of the [How-To Guide](/substreams/developing/dev-container/). diff --git a/website/src/pages/sv/substreams/sps/tutorial.mdx b/website/src/pages/sv/substreams/sps/tutorial.mdx deleted file mode 100644 index 0aabe284b6d0..000000000000 --- a/website/src/pages/sv/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. - -## Komma igång - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial) - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. 
- -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.
- -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations!
You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/tr/substreams/sps/_meta.js b/website/src/pages/tr/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/tr/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/tr/substreams/sps/faq.mdx b/website/src/pages/tr/substreams/sps/faq.mdx deleted file mode 100644 index 30401c2c76bd..000000000000 --- a/website/src/pages/tr/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs?
- -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of data sources that specify onchain events and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single data source that references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
- -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. - -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
- -- Richest data model: The best data model, including balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without writing any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph and be queried by consumers.
- -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/tr/substreams/sps/introduction.mdx b/website/src/pages/tr/substreams/sps/introduction.mdx deleted file mode 100644 index 4df56659c9f8..000000000000 --- a/website/src/pages/tr/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2.
**Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### Additional Resources - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/tr/substreams/sps/triggers.mdx b/website/src/pages/tr/substreams/sps/triggers.mdx deleted file mode 100644 index 648b624258e3..000000000000 --- a/website/src/pages/tr/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers ---- - -Use Custom Triggers to enable full use of GraphQL. - -## Overview - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler.
- -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. - -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object; it can be used like any other AssemblyScript object -2. The handler loops over the transactions -3. A new Subgraph entity is created for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Additional Resources - -To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/). diff --git a/website/src/pages/tr/substreams/sps/tutorial.mdx b/website/src/pages/tr/substreams/sps/tutorial.mdx deleted file mode 100644 index 1d5d17bc712f..000000000000 --- a/website/src/pages/tr/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
- -## Getting Started - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial). - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. - -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: -
package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } -
} -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Additional Resources - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/uk/substreams/sps/_meta.js b/website/src/pages/uk/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/uk/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/uk/substreams/sps/faq.mdx b/website/src/pages/uk/substreams/sps/faq.mdx deleted file mode 100644 index 250c466d5929..000000000000 --- a/website/src/pages/uk/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer.
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of data sources that specify onchain events and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single data source that references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs.
They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. 
- -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model: The best data model, including balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer, enabling cached modules to be reused. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle.
A single Substreams request will package all of these individual modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/uk/substreams/sps/introduction.mdx b/website/src/pages/uk/substreams/sps/introduction.mdx deleted file mode 100644 index 8a801f1a048a..000000000000 --- a/website/src/pages/uk/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph.
This method creates the Subgraph entities directly in the Subgraph. - -2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### Additional Resources - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/uk/substreams/sps/triggers.mdx b/website/src/pages/uk/substreams/sps/triggers.mdx deleted file mode 100644 index 87181f9bd72d..000000000000 --- a/website/src/pages/uk/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers ---- - -Use Custom Triggers to enable full use of GraphQL. - -## Overview - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler.
This ensures efficient and streamlined data management within the Subgraph framework. - -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. - -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object, this object is used like any other AssemblyScript object -2. Looping over the transactions -3. Create a new Subgraph entity for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### Додаткові матеріали - -To scaffold your first project in the Development Container, check out one of the [How-To Guide](/substreams/developing/dev-container/). diff --git a/website/src/pages/uk/substreams/sps/tutorial.mdx b/website/src/pages/uk/substreams/sps/tutorial.mdx deleted file mode 100644 index 6b611ef2c923..000000000000 --- a/website/src/pages/uk/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. 
- -## Розпочати роботу - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial) - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. - -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - 
package: - moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - 
} -} -``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### Додаткові матеріали - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/ur/substreams/sps/_meta.js b/website/src/pages/ur/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/ur/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/ur/substreams/sps/faq.mdx b/website/src/pages/ur/substreams/sps/faq.mdx deleted file mode 100644 index 292390a34142..000000000000 --- a/website/src/pages/ur/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## سب اسٹریمز کیا ہیں؟ - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. 
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. 
They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## سب سٹریمز کے فوائد کہاں ہیں؟ - -سب سٹریمز کو استعمال کرنے کے بہت سے فوائد ہیں، بشمول: - -- کمپوز ایبل: آپ سب سٹریمز ماڈیولز جیسے LEGO بلاکس کو اسٹیک کر سکتے ہیں، اور عوامی ڈیٹا کو مزید بہتر کرتے ہوئے کمیونٹی ماڈیول بنا سکتے ہیں. - -- اعلی کارکردگی کی انڈیکسنگ: متوازی کارروائیوں کے بڑے پیمانے پر کلسٹرز کے ذریعے تیز تر انڈیکسنگ کے آرڈرز (سوچیں BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- قابل پروگرام: اسے اپنی مرضی کے مطابق بنانے کے لیے کوڈ کریں، دو ٹرانسفارم ٹائم ایگریگیشنز، اور متعدد حواس کے لیے اپنے آؤٹ پٹ کو ماڈل کریں. - -- اضافی ڈیٹا تک رسائی جو JSON RPC کے حصے کے طور پر دستیاب نہیں ہے - -- Firehose کے تمام فوائد. - -## Firehose کیا ہے؟ - -[StreamingFast](https://www.streamingfast.io/) کے ذریعے تیار کردہ، Firehose ایک بلاکچین ڈیٹا نکالنے کی پرت ہے جسے شروع سے بلاکچینز کی مکمل تاریخ کو اس رفتار سے پروسیس کرنے کے لیے ڈیزائن کیا گیا ہے جو پہلے نظر نہیں آتی تھیں۔ فائلوں پر مبنی اور سٹریمنگ فرسٹ اپروچ فراہم کرنا، یہ سٹریمنگ فاسٹ کے اوپن سورس ٹیکنالوجیز کے سوٹ کا بنیادی جزو اور سب اسٹریمز کی بنیاد ہے. - -Firehose کے بارے میں مزید جاننے کے لیے[documentation] (https://firehose.streamingfast.io/) پر جائیں. - -## Firehose کے کیا فوائد ہیں؟ - -Firehose استعمال کرنے کے بہت سے فوائد ہیں، بشمول: - -- سب سے کم تاخیر اور کوئی پولنگ نہیں: اسٹریمنگ کے پہلے انداز میں، Firehose نوڈس کو پہلے بلاک ڈیٹا کو آگے بڑھانے کی دوڑ کے لیے ڈیزائن کیا گیا ہے. - -- ڈاؤن ٹائمز کو روکتا ہے: اعلی دستیابی کے لیے زمین سے ڈیزائن کیا گیا ہے. 
- -- کبھی بھی بیٹ مت چھوڑیں: Firehose سٹریم کرسر کو فورکس ہینڈل کرنے اور کسی بھی حالت میں وہیں سے جاری رکھنے کے لیے بنایا گیا ہے جہاں آپ نے چھوڑا تھا. - -- امیرترین ڈیٹا ماڈل: بہترین ڈیٹا ماڈل جس میں بیلنس کی تبدیلیاں، مکمل کال ٹری، اندرونی ٹرانزیکشن، لاگز، اسٹوریج کی تبدیلیاں، گیس کی قیمتیں اور بہت کچھ شامل ہے. - -- فلیٹ فائلوں کا فائدہ اٹھاتا ہے: بلاکچین ڈیٹا کو فلیٹ فائلوں میں نکالا جاتا ہے، جو دستیاب سب سے سستا اور بہترین کمپیوٹنگ وسیلہ ہے. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## سب سٹریمز میں Rust ماڈیولز کا کیا کردار ہے؟ - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## سب سٹریمز کو کمپوز ایبل کیا بناتا ہے؟ - -سب سٹریمز کا استعمال کرتے وقت، کمپوزیشن ٹرانسفارمیشن لیئر پر ہوتی ہے جو کیشڈ ماڈیولز کو دوبارہ استعمال کرنے کے قابل بناتی ہے. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individual's modules, link them together, to offer a much more refined stream of data. 
That stream can then be used to populate a Subgraph, and be queried by consumers. - -## آپ سب سٹریمز سے چلنے والے سب گراف کو کیسے بنا اور تعینات کر سکتے ہیں؟ - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this Github repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -انضمام بہت سے فوائد کا وعدہ کرتا ہے، بشمول انتہائی اعلی کارکردگی کی انڈیکسنگ اور کمیونٹی ماڈیولز کا فائدہ اٹھا کر اور ان پر تعمیر کرنے کے ذریعے زیادہ کمپوز ایبلٹی. diff --git a/website/src/pages/ur/substreams/sps/introduction.mdx b/website/src/pages/ur/substreams/sps/introduction.mdx deleted file mode 100644 index b98518e49e1d..000000000000 --- a/website/src/pages/ur/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: تعارف ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## جائزہ - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph. - -2. 
**Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities. - -You can choose where to place your logic, either in the Subgraph or Substreams. However, consider what aligns with your data needs, as Substreams has a parallelized model, and triggers are consumed linearly in the graph node. - -### اضافی وسائل - -Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly: - -- [Solana](/substreams/developing/solana/transactions/) -- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm) -- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet) -- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective) -- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra) -- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar) diff --git a/website/src/pages/ur/substreams/sps/triggers.mdx b/website/src/pages/ur/substreams/sps/triggers.mdx deleted file mode 100644 index e1149c68812f..000000000000 --- a/website/src/pages/ur/substreams/sps/triggers.mdx +++ /dev/null @@ -1,47 +0,0 @@ ---- -title: Substreams Triggers ---- - -Use Custom Triggers and enable the full use GraphQL. - -## جائزہ - -Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer. - -By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework. 
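Before defining the handler itself, it helps to see where a trigger handler is declared. As a minimal, illustrative sketch (the data source name, network, module name, and package path below are placeholders, not values from this page), the Subgraph manifest wires a Substreams package to a mappings handler like this:

```yaml
dataSources:
  - kind: substreams
    name: my_triggers              # illustrative name
    network: mainnet               # placeholder network
    source:
      package:
        moduleName: map_events     # a module defined in your substreams.yaml
        file: ./my-package-v0.1.0.spkg # placeholder path to your .spkg
    mapping:
      apiVersion: 0.0.9
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTransactions  # the function your mappings file exports
```

graph-node then invokes the named handler with the raw bytes produced by the selected module for each block.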
- -### Defining `handleTransactions` - -The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created. - -```tsx -export function handleTransactions(bytes: Uint8Array): void { - let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1. - if (transactions.length == 0) { - log.info('No transactions found', []) - return - } - - for (let i = 0; i < transactions.length; i++) { - // 2. - let transaction = transactions[i] - - let entity = new Transaction(transaction.hash) // 3. - entity.from = transaction.from - entity.to = transaction.to - entity.save() - } -} -``` - -Here's what you're seeing in the `mappings.ts` file: - -1. The bytes containing Substreams data are decoded into the generated `Transactions` object, this object is used like any other AssemblyScript object -2. Looping over the transactions -3. Create a new Subgraph entity for every transaction - -To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/). - -### اضافی وسائل - -To scaffold your first project in the Development Container, check out one of the [How-To Guide](/substreams/developing/dev-container/). diff --git a/website/src/pages/ur/substreams/sps/tutorial.mdx b/website/src/pages/ur/substreams/sps/tutorial.mdx deleted file mode 100644 index 841654e04782..000000000000 --- a/website/src/pages/ur/substreams/sps/tutorial.mdx +++ /dev/null @@ -1,155 +0,0 @@ ---- -title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana' -sidebarTitle: Tutorial ---- - -Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. 
- -## شروع کریں - -For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial) - -### Prerequisites - -Before starting, make sure to: - -- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container. -- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs. - -### Step 1: Initialize Your Project - -1. Open your Dev Container and run the following command to initialize your project: - - ```bash - substreams init - ``` - -2. Select the "minimal" project option. - -3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID: - -```yaml -specVersion: v0.1.0 -package: - name: my_project_sol - version: v0.1.0 - -imports: # Pass your spkg of interest - solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg - -modules: - - name: map_spl_transfers - use: solana:map_block # Select corresponding modules available within your spkg - initialBlock: 260000082 - - - name: map_transactions_by_programid - use: solana:solana:transactions_by_programid_without_votes - -network: solana-mainnet-beta - -params: # Modify the param fields to meet your needs - # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA - map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE -``` - -### Step 2: Generate the Subgraph Manifest - -Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container: - -```bash -substreams codegen subgraph -``` - -You will generate a`subgraph.yaml` manifest which imports the Substreams package as a data source: - -```yaml ---- -dataSources: - - kind: substreams - name: my_project_sol - network: solana-mainnet-beta - source: - package: 
- moduleName: map_spl_transfers # Module defined in the substreams.yaml - file: ./my-project-sol-v0.1.0.spkg - mapping: - apiVersion: 0.0.9 - kind: substreams/graph-entities - file: ./src/mappings.ts - handler: handleTriggers -``` - -### Step 3: Define Entities in `schema.graphql` - -Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file. - -Here is an example: - -```graphql -type MyTransfer @entity { - id: ID! - amount: String! - source: String! - designation: String! - signers: [String!]! -} -``` - -This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`. - -### Step 4: Handle Substreams Data in `mappings.ts` - -With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. - -The example below demonstrates how to extract to Subgraph entities the non-derived transfers associated to the Orca account id: - -```ts -import { Protobuf } from 'as-proto/assembly' -import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events' -import { MyTransfer } from '../generated/schema' - -export function handleTriggers(bytes: Uint8Array): void { - const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode) - - for (let i = 0; i < input.data.length; i++) { - const event = input.data[i] - - if (event.transfer != null) { - let entity_id: string = `${event.txnId}-${i}` - const entity = new MyTransfer(entity_id) - entity.amount = event.transfer!.instruction!.amount.toString() - entity.source = event.transfer!.accounts!.source - entity.designation = event.transfer!.accounts!.destination - - if (event.transfer!.accounts!.signer!.single != null) { - entity.signers = [event.transfer!.accounts!.signer!.single!.signer] - } else if (event.transfer!.accounts!.signer!.multisig != null) { - entity.signers = event.transfer!.accounts!.signer!.multisig!.signers - } - entity.save() - } - } -} 
-``` - -### Step 5: Generate Protobuf Files - -To generate Protobuf objects in AssemblyScript, run the following command: - -```bash -npm run protogen -``` - -This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler. - -### Conclusion - -Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case. - -### Video Tutorial - - - -### اضافی وسائل - -For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana). diff --git a/website/src/pages/vi/substreams/sps/_meta.js b/website/src/pages/vi/substreams/sps/_meta.js deleted file mode 100644 index 86fcd3df5ec0..000000000000 --- a/website/src/pages/vi/substreams/sps/_meta.js +++ /dev/null @@ -1,6 +0,0 @@ -export default { - introduction: '', - triggers: '', - tutorial: '', - faq: '', -} diff --git a/website/src/pages/vi/substreams/sps/faq.mdx b/website/src/pages/vi/substreams/sps/faq.mdx deleted file mode 100644 index 250c466d5929..000000000000 --- a/website/src/pages/vi/substreams/sps/faq.mdx +++ /dev/null @@ -1,96 +0,0 @@ ---- -title: Substreams-Powered Subgraphs FAQ -sidebarTitle: FAQ ---- - -## What are Substreams? - -Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications. - -Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. 
It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere. - -Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams. - -## What are Substreams-powered Subgraphs? - -[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities. - -If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API. - -## How are Substreams-powered Subgraphs different from Subgraphs? - -Subgraphs are made up of data sources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain. - -By contrast, Substreams-powered Subgraphs have a single data source which references a Substreams package, which is processed by the Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times. - -## What are the benefits of using Substreams-powered Subgraphs? - -Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. 
They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka. - -## What are the benefits of Substreams? - -There are many benefits to using Substreams, including: - -- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data. - -- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery). - -- Sink anywhere: Sink your data to anywhere you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets. - -- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks. - -- Access to additional data which is not available as part of the JSON RPC - -- All the benefits of the Firehose. - -## What is the Firehose? - -Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at speeds that were previously unseen. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams. - -Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose. - -## What are the benefits of the Firehose? - -There are many benefits to using Firehose, including: - -- Lowest latency & no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first. 
- -- Prevents downtimes: Designed from the ground up for High Availability. - -- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition. - -- Richest data model:  Best data model that includes the balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more. - -- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available. - -## Where can developers access more information about Substreams-powered Subgraphs and Substreams? - -The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules. - -The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph. - -The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without any code. - -## What is the role of Rust modules in Substreams? - -Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data. - -See [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details. - -## What makes Substreams composable? - -When using Substreams, the composition happens at the transformation layer enabling cached modules to be re-used. - -As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for some tokens of his interest, and Lisa can combine four individual DEX price modules to create a price oracle. 
A single Substreams request will package all of these individual modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph, and be queried by consumers. - -## How can you build and deploy a Substreams-powered Subgraph? - -After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/). - -## Where can I find examples of Substreams and Substreams-powered Subgraphs? - -You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs. - -## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network? - -The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them. diff --git a/website/src/pages/vi/substreams/sps/introduction.mdx b/website/src/pages/vi/substreams/sps/introduction.mdx deleted file mode 100644 index bd0bb34b8342..000000000000 --- a/website/src/pages/vi/substreams/sps/introduction.mdx +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: Introduction to Substreams-Powered Subgraphs -sidebarTitle: Introduction ---- - -Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data. - -## Overview - -Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks. - -### Specifics - -There are two methods of enabling this technology: - -1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. 
This method creates the Subgraph entities directly in the Subgraph.
-
-2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.
-
-You can choose where to place your logic, either in the Subgraph or in Substreams. However, consider what aligns with your data needs: Substreams has a parallelized model, while triggers are consumed linearly in the graph node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/vi/substreams/sps/triggers.mdx b/website/src/pages/vi/substreams/sps/triggers.mdx
deleted file mode 100644
index 41b53829a5e7..000000000000
--- a/website/src/pages/vi/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers to take full advantage of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler.
This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
-2. Loop over the transactions
-3. Create a new Subgraph entity for every transaction
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/vi/substreams/sps/tutorial.mdx b/website/src/pages/vi/substreams/sps/tutorial.mdx
deleted file mode 100644
index 05036a1b24ae..000000000000
--- a/website/src/pages/vi/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Getting Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
-
-### Step 1: Initialize Your Project
-
-1. Open your Dev Container and run the following command to initialize your project:
-
-   ```bash
-   substreams init
-   ```
-
-2. Select the "minimal" project option.
-
-3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### Step 2: Generate the Subgraph Manifest
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source:
-
-```yaml
----
-dataSources:
-  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-        moduleName: map_spl_transfers # Module defined in the substreams.yaml
-        file: ./my-project-sol-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers
-```
-
-### Step 3: Define Entities in `schema.graphql`
-
-Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-}
-```
-
-This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
-
-### Step 4: Handle Substreams Data in `mappings.ts`
-
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.
-
-The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:
-
-```ts
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
-
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
-
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
-
-    if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
-      }
-      entity.save()
-    }
-  }
-}
-```
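The entity ID in the mapping above combines the transaction signature with the loop index (`${event.txnId}-${i}`), so several transfers inside a single transaction never collide. The following is a minimal plain-TypeScript sketch of that ID scheme; it runs outside the AssemblyScript runtime, and the transaction ID used is a hypothetical placeholder, not a real Solana signature:

```ts
// Sketch of the entity-ID scheme used in handleTriggers: one unique ID per
// transfer within a transaction. 'exampleTxn' is a hypothetical placeholder.
function makeTransferIds(txnId: string, transferCount: number): string[] {
  const ids: string[] = []
  for (let i = 0; i < transferCount; i++) {
    // Same template as the mapping: `${txnId}-${i}`
    ids.push(`${txnId}-${i}`)
  }
  return ids
}

const ids = makeTransferIds('exampleTxn', 3)
// three distinct IDs: exampleTxn-0, exampleTxn-1, exampleTxn-2
```

Because the index restarts at zero for every trigger invocation, uniqueness relies on the transaction signature being unique, which holds per transaction on Solana.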
-
-### Step 5: Generate Protobuf Files
-
-To generate Protobuf objects in AssemblyScript, run the following command:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### Conclusion
-
-Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
-
-### Video Tutorial
-
-
-
-### Additional Resources
-
-For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
diff --git a/website/src/pages/zh/substreams/sps/_meta.js b/website/src/pages/zh/substreams/sps/_meta.js
deleted file mode 100644
index 86fcd3df5ec0..000000000000
--- a/website/src/pages/zh/substreams/sps/_meta.js
+++ /dev/null
@@ -1,6 +0,0 @@
-export default {
-  introduction: '',
-  triggers: '',
-  tutorial: '',
-  faq: '',
-}
diff --git a/website/src/pages/zh/substreams/sps/faq.mdx b/website/src/pages/zh/substreams/sps/faq.mdx
deleted file mode 100644
index 6fe73c72b8da..000000000000
--- a/website/src/pages/zh/substreams/sps/faq.mdx
+++ /dev/null
@@ -1,96 +0,0 @@
----
-title: Substreams-Powered Subgraphs FAQ
-sidebarTitle: FAQ
----
-
-## What are Substreams?
-
-Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
-
-Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/) their data anywhere.
-
-Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams documentation](/substreams/introduction/) to learn more about Substreams.
-
-## What are Substreams-powered Subgraphs?
-
-[Substreams-powered Subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of Subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with Subgraph entities.
-
-If you are already familiar with Subgraph development, note that Substreams-powered Subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of Subgraphs, including a dynamic and flexible GraphQL API.
-
-## How are Substreams-powered Subgraphs different from Subgraphs?
-
-Subgraphs are made up of datasources which specify onchain events, and how those events should be transformed via handlers written in AssemblyScript. These events are processed sequentially, based on the order in which events happen onchain.
-
-By contrast, Substreams-powered Subgraphs have a single datasource which references a Substreams package, which is processed by Graph Node. Substreams have access to additional granular onchain data compared to conventional Subgraphs, and can also benefit from massively parallelized processing, which can mean much faster processing times.
-
-## What are the benefits of using Substreams-powered Subgraphs?
-
-Substreams-powered Subgraphs combine all the benefits of Substreams with the queryability of Subgraphs. They bring greater composability and high-performance indexing to The Graph. They also enable new data use cases; for example, once you've built your Substreams-powered Subgraph, you can reuse your [Substreams modules](https://substreams.streamingfast.io/documentation/develop/manifest-modules) to output to different [sinks](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) such as PostgreSQL, MongoDB, and Kafka.
-
-## What are the benefits of Substreams?
-
-There are many benefits to using Substreams, including:
-
-- Composable: You can stack Substreams modules like LEGO blocks, and build upon community modules, further refining public data.
-
-- High-performance indexing: Orders of magnitude faster indexing through large-scale clusters of parallel operations (think BigQuery).
-
-- Sink anywhere: Sink your data to wherever you want: PostgreSQL, MongoDB, Kafka, Subgraphs, flat files, Google Sheets, and more.
-
-- Programmable: Use code to customize extraction, do transformation-time aggregations, and model your output for multiple sinks.
-
-- Access to additional data which is not available as part of the JSON RPC.
-
-- All the benefits of the Firehose.
-
-## What is the Firehose?
-
-Developed by [StreamingFast](https://www.streamingfast.io/), the Firehose is a blockchain data extraction layer designed from scratch to process the full history of blockchains at previously unseen speeds. Providing a files-based and streaming-first approach, it is a core component of StreamingFast's suite of open-source technologies and the foundation for Substreams.
-
-Go to the [documentation](https://firehose.streamingfast.io/) to learn more about the Firehose.
-
-## What are the benefits of the Firehose?
-
-There are many benefits to using the Firehose, including:
-
-- Lowest latency and no polling: In a streaming-first fashion, the Firehose nodes are designed to race to push out the block data first.
-
-- Prevents downtimes: Designed from the ground up for high availability.
-
-- Never miss a beat: The Firehose stream cursor is designed to handle forks and to continue where you left off in any condition.
-
-- Richest data model: The richest data model available, including balance changes, the full call tree, internal transactions, logs, storage changes, gas costs, and more.
-
-- Leverages flat files: Blockchain data is extracted into flat files, the cheapest and most optimized computing resource available.
-
-## Where can developers access more information about Substreams-powered Subgraphs and Substreams?
-
-The [Substreams documentation](/substreams/introduction/) will teach you how to build Substreams modules.
-
-The [Substreams-powered Subgraphs documentation](/sps/introduction/) will show you how to package them for deployment on The Graph.
-
-The [latest Substreams Codegen tool](https://streamingfastio.medium.com/substreams-codegen-no-code-tool-to-bootstrap-your-project-a11efe0378c6) will allow you to bootstrap a Substreams project without writing any code.
-
-## What is the role of Rust modules in Substreams?
-
-Rust modules are the equivalent of the AssemblyScript mappers in Subgraphs. They are compiled to WASM in a similar way, but the programming model allows for parallel execution. They define the sort of transformations and aggregations you want to apply to the raw blockchain data.
-
-See the [modules documentation](https://docs.substreams.dev/reference-material/substreams-components/modules#modules) for details.
-
-## What makes Substreams composable?
-
-When using Substreams, the composition happens at the transformation layer, which enables cached modules to be reused.
-
-As an example, Alice can build a DEX price module, Bob can use it to build a volume aggregator for tokens of interest, and Lisa can combine four individual DEX price modules to create a price oracle. A single Substreams request will package all of these individuals' modules and link them together to offer a much more refined stream of data. That stream can then be used to populate a Subgraph and be queried by consumers.
-
-## How can you build and deploy a Substreams-powered Subgraph?
-
-After [defining](/sps/introduction/) a Substreams-powered Subgraph, you can use the Graph CLI to deploy it in [Subgraph Studio](https://thegraph.com/studio/).
-
-## Where can I find examples of Substreams and Substreams-powered Subgraphs?
-
-You can visit [this GitHub repo](https://github.com/pinax-network/awesome-substreams) to find examples of Substreams and Substreams-powered Subgraphs.
-
-## What do Substreams and Substreams-powered Subgraphs mean for The Graph Network?
-
-The integration promises many benefits, including extremely high-performance indexing and greater composability by leveraging community modules and building on them.
diff --git a/website/src/pages/zh/substreams/sps/introduction.mdx b/website/src/pages/zh/substreams/sps/introduction.mdx
deleted file mode 100644
index bb5579cb7b65..000000000000
--- a/website/src/pages/zh/substreams/sps/introduction.mdx
+++ /dev/null
@@ -1,31 +0,0 @@
----
-title: Introduction to Substreams-Powered Subgraphs
-sidebarTitle: Introduction
----
-
-Boost your Subgraph's efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.
-
-## Overview
-
-Use a Substreams package (`.spkg`) as a data source to give your Subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
-
-### Specifics
-
-There are two methods of enabling this technology:
-
-1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a Subgraph handler and move all your logic into a Subgraph. This method creates the Subgraph entities directly in the Subgraph.
-
-2.
**Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your Subgraph entities.
-
-You can choose where to place your logic, either in the Subgraph or in Substreams. However, consider what aligns with your data needs: Substreams has a parallelized model, while triggers are consumed linearly in the graph node.
-
-### Additional Resources
-
-Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
-
-- [Solana](/substreams/developing/solana/transactions/)
-- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
-- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
-- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
-- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
-- [Stellar](https://docs.substreams.dev/tutorials/intro-to-tutorials/stellar)
diff --git a/website/src/pages/zh/substreams/sps/triggers.mdx b/website/src/pages/zh/substreams/sps/triggers.mdx
deleted file mode 100644
index a92760b3a388..000000000000
--- a/website/src/pages/zh/substreams/sps/triggers.mdx
+++ /dev/null
@@ -1,47 +0,0 @@
----
-title: Substreams Triggers
----
-
-Use Custom Triggers to take full advantage of GraphQL.
-
-## Overview
-
-Custom Triggers allow you to send data directly into your Subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
-
-By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your Subgraph's handler. This ensures efficient and streamlined data management within the Subgraph framework.
-
-### Defining `handleTransactions`
-
-The following code demonstrates how to define a `handleTransactions` function in a Subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new Subgraph entity is created.
-
-```tsx
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-Here's what you're seeing in the `mappings.ts` file:
-
-1.
The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
-2. Loop over the transactions
-3. Create a new Subgraph entity for every transaction
-
-To go through a detailed example of a trigger-based Subgraph, [check out the tutorial](/sps/tutorial/).
-
-### Additional Resources
-
-To scaffold your first project in the Development Container, check out one of the [How-To Guides](/substreams/developing/dev-container/).
diff --git a/website/src/pages/zh/substreams/sps/tutorial.mdx b/website/src/pages/zh/substreams/sps/tutorial.mdx
deleted file mode 100644
index c8ca5c967a22..000000000000
--- a/website/src/pages/zh/substreams/sps/tutorial.mdx
+++ /dev/null
@@ -1,155 +0,0 @@
----
-title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
-sidebarTitle: Tutorial
----
-
-Successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token.
-
-## Getting Started
-
-For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
-
-### Prerequisites
-
-Before starting, make sure to:
-
-- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
-- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
-
-### Step 1: Initialize Your Project
-
-1. Open your Dev Container and run the following command to initialize your project:
-
-   ```bash
-   substreams init
-   ```
-
-2. Select the "minimal" project option.
-
-3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
-
-```yaml
-specVersion: v0.1.0
-package:
-  name: my_project_sol
-  version: v0.1.0
-
-imports: # Pass your spkg of interest
-  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
-
-modules:
-  - name: map_spl_transfers
-    use: solana:map_block # Select corresponding modules available within your spkg
-    initialBlock: 260000082
-
-  - name: map_transactions_by_programid
-    use: solana:solana:transactions_by_programid_without_votes
-
-network: solana-mainnet-beta
-
-params: # Modify the param fields to meet your needs
-  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
-  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
-```
-
-### Step 2: Generate the Subgraph Manifest
-
-Once the project is initialized, generate a Subgraph manifest by running the following command in the Dev Container:
-
-```bash
-substreams codegen subgraph
-```
-
-You will generate a `subgraph.yaml` manifest which imports the Substreams package as a data source:
-
-```yaml
----
-dataSources:
  - kind: substreams
-    name: my_project_sol
-    network: solana-mainnet-beta
-    source:
-      package:
-        moduleName: map_spl_transfers # Module defined in the substreams.yaml
-        file: ./my-project-sol-v0.1.0.spkg
-    mapping:
-      apiVersion: 0.0.9
-      kind: substreams/graph-entities
-      file: ./src/mappings.ts
-      handler: handleTriggers
-```
-
-### Step 3: Define Entities in `schema.graphql`
-
-Define the fields you want to save in your Subgraph entities by updating the `schema.graphql` file.
-
-Here is an example:
-
-```graphql
-type MyTransfer @entity {
-  id: ID!
-  amount: String!
-  source: String!
-  designation: String!
-  signers: [String!]!
-}
-```
-
-This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
-
-### Step 4: Handle Substreams Data in `mappings.ts`
-
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.
-
-The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into Subgraph entities:
-
-```ts
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
-
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode(bytes, protoEvents.decode)
-
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
-
-    if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single!.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
-      }
-      entity.save()
-    }
-  }
-}
-```
-
-### Step 5: Generate Protobuf Files
-
-To generate Protobuf objects in AssemblyScript, run the following command:
-
-```bash
-npm run protogen
-```
-
-This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the Subgraph's handler.
-
-### Conclusion
-
-Congratulations! You've successfully set up a trigger-based Substreams-powered Subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
-
-### Video Tutorial
-
-
-
-### Additional Resources
-
-For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
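The signer branch in Step 4 of the tutorial above normalizes two Protobuf variants, a single signer or a multisig, into one flat `signers` array. The sketch below shows that normalization in standalone TypeScript; the interfaces are simplified hypothetical stand-ins for the generated Protobuf classes, not the real as-proto output:

```ts
// Simplified stand-ins for the generated signer types (hypothetical shapes,
// for illustration only).
interface SingleSigner {
  signer: string
}
interface MultiSigner {
  signers: string[]
}
interface Authority {
  single: SingleSigner | null
  multisig: MultiSigner | null
}

// Mirror the if/else branch in the mapping: always end up with string[].
function normalizeSigners(authority: Authority): string[] {
  if (authority.single != null) {
    return [authority.single.signer]
  } else if (authority.multisig != null) {
    return authority.multisig.signers
  }
  // Assumption: the original mapping simply leaves `signers` unset here;
  // an empty array is used as the standalone equivalent.
  return []
}

const single = normalizeSigners({ single: { signer: 'addr1' }, multisig: null })
// single is ['addr1']
```

Collapsing both variants into one `string[]` keeps the `signers: [String!]!` schema field uniform regardless of how the transfer was authorized.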