docs: update secretagent docs
blakebyrnes committed Nov 20, 2020
1 parent 0556411 commit e1e095b
Showing 4 changed files with 23 additions and 4 deletions.
8 changes: 6 additions & 2 deletions website/docs/BasicInterfaces/SecretAgent.md
@@ -13,9 +13,13 @@ const SecretAgent = require('secret-agent');
})();
```

Unlike most other browsers, SecretAgent is initialized with a single window that can spawn tabs. Only a single tab can be focused at a time, meaning clicks and other user interaction will go to the active tab.
A SecretAgent instance can be thought of as a single user's browsing session. An instance has a [replayable](../advanced/session-replay) [Session](../advanced/session) that will record all commands, DOM changes, interactions and page events.

Each SecretAgent instance has its own cache, cookies, session data, and [BrowserEmulator](../advanced-features/browser-emulators). No data is shared between instances -- each operates within an airtight sandbox to ensure no identities leak across requests.
Instances are very lightweight, sharing a pool of browsers underneath. You should create a new instance for activity that would benefit from parallelization by multiple "users".
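The "one instance per user" pattern above can be sketched as parallel tasks. This is illustrative only: `scrapeProfile` is a hypothetical stand-in for the per-agent work (with the real library, each call would create its own `new SecretAgent()` and `await agent.close()` when done).

```js
// Hypothetical stand-in for work done with a dedicated SecretAgent instance.
// With the real library: const agent = new SecretAgent(); ... await agent.close();
async function scrapeProfile(userId) {
  return `profile-${userId}`; // placeholder for real per-agent scraping work
}

// Each "user" runs in parallel; instances are cheap because the browser pool is shared.
const resultsPromise = Promise.all([1, 2, 3].map(scrapeProfile));
resultsPromise.then((results) => console.log(results)); // → [ 'profile-1', 'profile-2', 'profile-3' ]
```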

SecretAgent instances can have multiple [Tabs](./tab), but only a single tab can be focused at a time. Clicks and other user interaction will go to the active tab (interacting with multiple tabs at once by a single user is easily detectable).

Each SecretAgent instance creates a private environment with its own cache, cookies, session data, and [BrowserEmulator](../advanced-features/browser-emulators). No data is shared between instances -- each operates within an airtight sandbox to ensure no identities leak across requests.

## Constructor

15 changes: 15 additions & 0 deletions website/docs/Overview/BasicConcepts.md
@@ -63,4 +63,19 @@ When used in a simple example as shown above, Puppeteer's approach seems okay. Ho

## Headless Browsers Need Not Always Render

When you're trying to eke out performance, a common technique is to disable rendering various parts of a webpage. SecretAgent allows you to [turn off](./configuration#rendering) everything from the style and images of a page to the JavaScript environment. You can even simulate making HTTP requests from inside a loaded web page, without ever loading the page.

```js
const SecretAgent = require('secret-agent');

(async () => {
  const agent = new SecretAgent({
    renderingOptions: ['None'],
  });
  await agent.goto('https://secretagent.dev');
  // referer will be https://secretagent.dev
  const doc = await agent.fetch('https://secretagent.dev/docs/overview/configuration');
})();
```

## Mice and Keyboards Are Human Too

SecretAgent drives mice and keyboards with [Human Emulators](../advanced/human-emulators). Human emulators translate your clicks and moves into randomized human-like patterns that can pass bot-blocker checks.
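To give a feel for what "randomized human-like patterns" means, here is a minimal sketch of one common idea: a curved mouse sweep with per-point jitter. This is an assumption-laden illustration, not SecretAgent's actual Human Emulator algorithm; `humanMousePath` and the quadratic-bezier-plus-jitter approach are inventions for this example.

```js
// Illustrative only -- NOT SecretAgent's real algorithm. A straight, evenly
// timed line is machine-like; a curved (quadratic bezier) path with small
// random jitter looks far more like a human hand on a mouse.
function humanMousePath(from, to, steps = 20) {
  // A random control point bends the path away from a straight line.
  const ctrl = {
    x: (from.x + to.x) / 2 + (Math.random() - 0.5) * 100,
    y: (from.y + to.y) / 2 + (Math.random() - 0.5) * 100,
  };
  const points = [];
  for (let i = 1; i <= steps; i++) {
    const t = i / steps;
    // Quadratic bezier interpolation between from, ctrl, and to.
    const bez = (a, c, b) => (1 - t) ** 2 * a + 2 * (1 - t) * t * c + t ** 2 * b;
    points.push({
      x: bez(from.x, ctrl.x, to.x) + (Math.random() - 0.5) * 2, // sub-pixel jitter
      y: bez(from.y, ctrl.y, to.y) + (Math.random() - 0.5) * 2,
    });
  }
  return points;
}

const path = humanMousePath({ x: 0, y: 0 }, { x: 500, y: 300 });
```

The final point lands on the target (within jitter), while every intermediate point varies from run to run, which is the property bot-blocker checks look for.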
2 changes: 1 addition & 1 deletion website/docs/Overview/Configuration.md
@@ -34,7 +34,7 @@ This can only be set on SecretAgent during the first instantiation or [`SecretAg

Configurable via [`Core.configure()`](#core-configure) or [`Core.prewarm()`](#core-prewarm).

### Rendering Options <div class="specs"><i>Class</i><i>Instance</i></div>
### Rendering Options <div class="specs"><i>Class</i><i>Instance</i></div> {#rendering}

One of the best ways to optimize SecretAgent's memory and CPU is limiting the `renderingOptions` to only what you need. The following are valid options.

2 changes: 1 addition & 1 deletion website/docs/Overview/Introduction.md
@@ -7,7 +7,7 @@
- **Built for scraping** - it's the first modern headless browser designed specifically for scraping instead of just automated testing.
- **Designed for web developers** - We've recreated a fully compliant DOM directly in NodeJS, allowing you to bypass the headaches of previous scraper tools.
- **Powered by Chromium** - The powerful Chromium engine sits under the hood, allowing for lightning-fast rendering.
- **Emulates any modern browser** - BrowserEmulator plugins make it easy to disguise your script as practically any browser.
- **Emulates any modern browser** - Browser emulators make it easy to disguise your script as practically any browser.
- **Avoids detection along the entire stack** - Don't be blocked because of TLS fingerprints in your networking stack.

## How It Works
