:::
## Packaging the project
Until now, we've been adding dependencies to our project by installing them with `pip` inside an activated virtual environment. If we were to send our code to a friend, they wouldn't know what to install before they could run the scraper without import errors. The same applies if we were to send our code to a cloud platform.
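For context, this is roughly the manual workflow we've been relying on so far. The `.venv` directory name is an assumption here; it's whatever we named our virtual environment, and the activation command differs on Windows:

```text
$ source .venv/bin/activate
(.venv) $ pip install crawlee
```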
In the root of the project, let's create a file called `requirements.txt`, with a single line consisting of a single word:
```text title="requirements.txt"
crawlee
```
Each line in the file represents a single dependency, though our program has just one. With `requirements.txt` in place, Apify can run `pip install -r requirements.txt` to download and install all of the project's dependencies before starting our program.
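Just for illustration, a requirements file for a larger project could list several dependencies, one per line, optionally pinned to exact versions. The packages and versions below are made up for the example:

```text
beautifulsoup4==4.12.3
crawlee==0.5.1
httpx==0.27.0
```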
:::tip Packaging projects
The [requirements file](https://pip.pypa.io/en/latest/user_guide/#requirements-files) is an older, minimal approach to packaging a Python project, but it still works, and it's the simplest, which is convenient for the purposes of this lesson.
For any serious work, the best and most future-proof approach to packaging is to create a [`pyproject.toml`](https://packaging.python.org/en/latest/guides/writing-pyproject-toml/) configuration file. We recommend the official [Python Packaging User Guide](https://packaging.python.org/) for more information.
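For a rough idea of what that looks like, a minimal `pyproject.toml` for this project could contain something like the following sketch. The project name and version are placeholders, not anything this lesson prescribes:

```toml title="pyproject.toml"
[project]
# placeholder name and version, just for illustration
name = "product-scraper"
version = "0.1.0"
dependencies = [
    "crawlee",
]
```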
:::
## Registering
As a second step, let's [create a new Apify account](https://console.apify.com/sign-up). The process includes several verifications that you're a human being and that your e-mail address is valid. While annoying, these are necessary measures to prevent abuse of the platform.
Apify serves both as infrastructure where we can privately deploy and run our own scrapers, and as a marketplace where anyone can offer their ready-made scrapers to others for rent. We'll restrain our curiosity for now and leave exploring the Apify Store for later.
## Getting access from the command line
To control the platform from our machine and send the code of our program there, we'll need the Apify CLI. On macOS, we can install the CLI using [Homebrew](https://brew.sh); on other systems, we'll first need [Node.js](https://nodejs.org/en/download).
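At the time of writing, the installation boils down to one of these commands, depending on the system; the guide linked below has the details:

```text
$ brew install apify-cli    # macOS, with Homebrew
$ npm install -g apify-cli  # other systems, with Node.js installed
```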
After following the [Apify CLI installation guide](https://docs.apify.com/cli/docs/installation), we'll verify that we installed the tool by printing its version:
```text
$ apify --version
apify-cli/0.0.0 system-arch00 node-v0.0.0
```
Now let's connect the CLI with the platform using our account:
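In the Apify CLI, that's the `login` command, which interactively walks us through authenticating, such as by providing an API token from the Apify Console. The interactive output is omitted here, as it may differ between versions:

```text
$ apify login
```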