# `Intel® AI Analytics Toolkit (AI Kit) Container Getting Started` Sample

The `Intel® AI Analytics Toolkit (AI Kit) Container Getting Started` sample demonstrates how to use AI Kit containers.

| Area | Description
|:--- |:---
| What you will learn | How to start using the Intel® oneapi-aikit container
| Time to complete | 10 minutes
| Category | Tutorial
|
14 |
| - |
15 |
| -| Optimized for | Description |
16 |
| -|:--- |:--- |
17 |
| -| OS | Linux* Ubuntu* 18.04 |
18 |
| -| Hardware | Intel® Xeon® Scalable processor family or newer |
19 |
| -| Software | Intel® AI Analytics Toolkit |
20 |
| -| What you will learn | How to start using the Intel® oneapi-aikit container |
21 |
| -| Time to complete | 10 minutes |
| 11 | +For more information on the **oneapi-aikit** container, see [Intel AI Analytics Toolkit](https://hub.docker.com/r/intel/oneapi-aikit) Docker Hub location. |
22 | 12 |
|
23 | 13 | ## Purpose
|
24 | 14 |
|

This sample provides a Bash script to help you configure an AI Kit container environment. You can build and train deep learning models using this Docker* environment.

Containers allow you to set up and configure environments for building, running, and profiling AI applications and to distribute them using images. You can also use Kubernetes* to automate the deployment and management of containers in the cloud.

Read [Get Started with the Intel® AI Analytics Toolkit for Linux*](https://www.intel.com/content/www/us/en/develop/documentation/get-started-with-ai-linux/top.html) to find out how you can achieve performance gains for popular deep-learning and machine-learning frameworks through Intel optimizations.

This sample shows an easy way to start using any of the [Intel® AI Analytics Toolkit (AI Kit)](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-analytics-toolkit.html) components without the hassle of installing the toolkit or configuring networking and file sharing.

## Prerequisites

| Optimized for | Description
|:--- |:---
| OS | Ubuntu* 20.04 (or newer)
| Hardware | Intel® Xeon® Scalable processor family
| Software | Intel® AI Analytics Toolkit (AI Kit)

## Key Implementation Details

The Bash script provided in this sample performs the following configuration steps:

- Mounts the `/home` folder from the host machine into the Docker container. You can share files between the host machine and the Docker container through the `/home` folder.

- Applies proxy settings from the host machine to the Docker container.

- Uses the same IP addresses for the host machine and the Docker container.

- Forwards ports 8888, 6006, 6543, and 12345 from the host machine to the Docker container for some popular network services, such as Jupyter* Notebook and TensorFlow* TensorBoard.
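
The steps above can be sketched as a single `docker run` invocation. The following is a hypothetical approximation of what a wrapper script like `run_oneapi_docker.sh` might assemble; the container name `aikit_container` and the exact flags shown are assumptions, not the script's actual contents.

```sh
# Hypothetical sketch: a docker run command implementing the steps above.
# With --network=host the container shares the host's IP addresses, so the
# listed service ports (8888, 6006, 6543, 12345) are reachable directly;
# a port-mapping variant would pass -p flags instead.
IMAGE="${1:-intel/oneapi-aikit}"
DOCKER_CMD="docker run -it --rm --name aikit_container \
  --network=host \
  -v /home:/home \
  -e http_proxy -e https_proxy -e no_proxy \
  $IMAGE /bin/bash"
echo "$DOCKER_CMD"
```

Passing `-e VAR` without a value forwards the host's current value of that variable into the container, which is one common way proxy settings carry over.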

## Run the `Intel® AI Analytics Toolkit (AI Kit) Container Getting Started` Sample

This sample uses a configuration script to automatically configure the environment, which provides a fast and less error-prone setup. For complete instructions on using the AI Kit containers, see the [Getting Started Guide](https://www.intel.com/content/www/us/en/develop/documentation/get-started-with-ai-linux/top/using-containers.html).

### On Linux*

You must have [Docker](https://docs.docker.com/engine/install/) installed.
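
To confirm the Docker client is available before you begin, you can run:

```sh
# Prints the Docker client version if docker is on PATH; otherwise a short message.
command -v docker >/dev/null 2>&1 && docker --version || echo "docker not found"
```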

1. Open a terminal.
2. Change to the sample folder, and pull the oneapi-aikit Docker image.
   ```
   docker pull intel/oneapi-aikit
   ```
   >**Note**: If a permission denied error occurs, run the following command, then log out and log in again.
   >```
   >sudo usermod -aG docker $USER
   >```

3. Run the Docker image using the `run_oneapi_docker.sh` Bash script.
   ```
   ./run_oneapi_docker.sh intel/oneapi-aikit
   ```
   The script opens a Bash shell inside the Docker container.

   >**Note**: You can install additional packages by adding them to the requirements.txt file in the sample. Copy the modified requirements.txt into the /tmp folder, and the Bash script will install those packages for you.
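   >
   >For example, a modified requirements.txt might contain entries like the following (the package names are illustrative, not part of the sample).
   >```
   >seaborn
   >openpyxl
   >```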

   To create a Bash session in the running container from outside the Docker container, enter a command similar to the following.
   ```
   docker exec -it aikit_container /bin/bash
   ```

4. In the Bash shell inside the Docker container, activate the environment for the framework you want to use.
   ```
   source activate tensorflow
   ```
   or
   ```
   source activate pytorch
   ```

You can now start using Intel® Optimization for TensorFlow* or Intel® Optimization for PyTorch* inside the Docker container.

>**Note**: You can verify the activated environment. Change to the directory with the IntelAIKitContainer sample and run the `version_check.py` script.
>```
>python version_check.py
>```

### Manage Docker* Images

You can install additional packages, upload the workloads via the `/tmp` folder, and then commit your changes into a new Docker image, for example, `intel/oneapi-aikit-v1`.
```
docker commit -a "intel" -m "test" DOCKER_ID intel/oneapi-aikit-v1
```
>**Note**: Replace `DOCKER_ID` with the ID of your container. Use `docker ps` to get the DOCKER_ID of your Docker container.

You can then use the new image name to start Docker.
```
./run_oneapi_docker.sh intel/oneapi-aikit-v1
```

You can save the Docker image as a tar file.
```
docker save -o oneapi-aikit-v1.tar intel/oneapi-aikit-v1
```

You can load the tar file on other machines.
```
docker load -i oneapi-aikit-v1.tar
```

### Docker Proxy

If you encounter Docker proxy problems, use the following instructions to configure proxy settings for your Docker client.

1. Create a directory for the Docker service configurations.
   ```
   sudo mkdir -p /etc/systemd/system/docker.service.d
   ```
2. Create a file called `proxy.conf` in the configuration directory.
   ```
   sudo vi /etc/systemd/system/docker.service.d/proxy.conf
   ```
3. Add content similar to the following to the `.conf` file. Change the values to match your environment.
   ```
   [Service]
   Environment="HTTP_PROXY=http://proxy-hostname:911/"
   Environment="HTTPS_PROXY=http://proxy-hostname:911/"
   Environment="NO_PROXY=10.0.0.0/8,192.168.0.0/16,localhost,127.0.0.0/8,134.134.0.0/16"
   ```
4. Save your changes and exit the text editor.
5. Reload the daemon configuration.
   ```
   sudo systemctl daemon-reload
   ```
6. Restart Docker to apply the changes.
   ```
   sudo systemctl restart docker.service
   ```

## Example Output

### Output from TensorFlow* Environment

```
TensorFlow version: 2.6.0
MKL enabled : True
```

### Output from PyTorch* Environment

```
PyTorch Version: 1.8.0a0+37c1f4a
mkldnn : True, mkl : True, openmp : True
```

## License

Code samples are licensed under the MIT license. See [License.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/License.txt) for details.

Third-party program licenses can be found here: [third-party-programs.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/third-party-programs.txt).