Commit 3b05b0f

Authored by: jkinsky, jimmytwei, krzeszew, alexsin368, ZhaoqiongZ
Intel® AI Analytics Toolkit (AI Kit) Container Getting Started sample readme update (#1492)
* Fixes for 2023.1 AI Kit (#1409)
* Intel Python Numpy Numba_dpes kNN sample (#1292): *.py and *.ipynb files with implementation; README.md and sample.json files with documentation; license and third party programs
* Adding PyTorch Training Optimizations with AMX BF16 oneAPI sample (#1293)
* Add IntelPytorch Quantization code samples (#1301): fix the spelling error in the README file; use John's README with grammar fix and title change; rename third-party-grograms.txt to third-party-programs.txt
* AMX bfloat16 mixed precision learning TensorFlow Transformer sample (#1317)
* [New Sample] Intel Extension for TensorFlow Getting Started (#1313): first draft; update README.md; remove redundant file
* [New Sample] [oneDNN] Benchdnn tutorial (#1315): new sample, readme update; rename sample to benchdnn_tutorial; name fix
* Add files via upload (#1320)
* [New Sample] oneCCL Bindings for PyTorch Getting Started (#1316): update README.md; add torch-ccl version check
* [New Sample] Intel Extension for PyTorch Getting Started (#1314): add new ipex GSG notebook for dGPU; update sample.json for expertise field; update requirements.txt package versions to comply with the Snyk tool
* Updated title field in sample.json in TF Transformer AMX bfloat16 Mixed Precision sample to fit within character length range (#1327)
* Add arch checker class (#1332)
* Change gpu.patch to convert the code samples from CPU to GPU correctly (#1334)
* Fixes for spelling in AMX bfloat16 transformer sample and printing error in Python code in numpy vs numba sample (#1335)
* 2023.1 AI Kit itex get started example fix (#1338): fix the typo; update ResNet50_Inference.ipynb
* Fix resnet inference demo link (#1339)
* Fix printing issue in numpy vs numba AI sample (#1356)
* Fix invalid Kmeans parameters on oneAPI 2023 (#1345)
* Update README to add new samples into the list (#1366)
* PyTorch AMX BF16 Training sample: remove graphs and performance numbers (#1408); update top README in Features and Functionality
* README text update

Co-authored-by: Jimmy Wei <[email protected]>
Co-authored-by: krzeszew <[email protected]>
Co-authored-by: alexsin368 <[email protected]>
Co-authored-by: ZhaoqiongZ <[email protected]>
Co-authored-by: Louie Tsai <[email protected]>
Co-authored-by: Orel Yehuda <[email protected]>
Co-authored-by: yuning <[email protected]>
Co-authored-by: Wang, Kai Lawrence <[email protected]>
Co-authored-by: xiguiw <[email protected]>
1 parent 986820c commit 3b05b0f

File tree

2 files changed: +104 −150 lines changed
@@ -1,208 +1,162 @@
# `Intel® AI Analytics Toolkit (AI Kit) Container Getting Started` Sample

The `Intel® AI Analytics Toolkit (AI Kit) Container Getting Started` sample demonstrates how to use AI Kit containers.

| Area                | Description
|:---                 |:---
| What you will learn | How to start using the Intel® oneapi-aikit container
| Time to complete    | 10 minutes
| Category            | Tutorial

For more information on the **oneapi-aikit** container, see the [Intel AI Analytics Toolkit](https://hub.docker.com/r/intel/oneapi-aikit) Docker Hub location.

## Purpose

This sample provides a Bash script to help you configure an AI Kit container environment. You can build and train deep learning models using this Docker* environment.

Containers allow you to set up and configure environments for building, running, and profiling AI applications and distribute them using images. You can also use Kubernetes* to automate the deployment and management of containers in the cloud.

Read [Get Started with the Intel® AI Analytics Toolkit for Linux*](https://www.intel.com/content/www/us/en/develop/documentation/get-started-with-ai-linux/top.html) to find out how you can achieve performance gains for popular deep-learning and machine-learning frameworks through Intel optimizations.

This sample shows an easy way to start using any of the [Intel® AI Analytics Toolkit (AI Kit)](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-analytics-toolkit.html) components without the hassle of installing the toolkit or configuring networking and file sharing.

## Prerequisites

| Optimized for | Description
|:---           |:---
| OS            | Ubuntu* 20.04 (or newer)
| Hardware      | Intel® Xeon® Scalable processor family
| Software      | Intel® AI Analytics Toolkit (AI Kit)

## Key Implementation Details

The Bash script provided in this sample performs the following configuration steps:

- Mounts the `/home` folder from the host machine into the Docker container. You can share files between the host machine and the Docker container through the `/home` folder.

- Applies proxy settings from the host machine into the Docker container.

- Uses the same IP addresses between the host machine and the Docker container.

- Forwards ports 8888, 6006, 6543, and 12345 from the host machine to the Docker container for some popular network services, such as Jupyter* Notebook and TensorFlow* TensorBoard.
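The configuration steps above map naturally onto `docker run` flags. The sketch below is illustrative only, not the actual contents of `run_oneapi_docker.sh`; the container name `aikit_container` and the exact flag choices are assumptions. It prints the command rather than executing it, so you can inspect the flags without Docker installed.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the flags a script like run_oneapi_docker.sh
# could pass to docker run; names and flags are illustrative only.
IMAGE="${1:-intel/oneapi-aikit}"

docker_args=(
  --name aikit_container            # name referenced by later docker exec
  -v /home:/home                    # share /home between host and container
  -e http_proxy -e https_proxy      # pass the host proxy settings through
  -p 8888:8888 -p 6006:6006         # Jupyter Notebook and TensorBoard
  -p 6543:6543 -p 12345:12345       # other forwarded services
)

echo docker run -it "${docker_args[@]}" "$IMAGE"
```

Printing the command first is a common pattern for wrapper scripts: it makes the port and volume mappings visible before anything runs.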
## Run the `Intel® AI Analytics Toolkit (AI Kit) Container Getting Started` Sample

This sample uses a configuration script to configure the environment automatically, which makes setup fast and less error prone. For complete instructions on using the AI Kit containers, see the [Getting Started Guide](https://www.intel.com/content/www/us/en/develop/documentation/get-started-with-ai-linux/top/using-containers.html).

### On Linux*

You must have [Docker](https://docs.docker.com/engine/install/) installed.

1. Open a terminal.
2. Change to the sample folder, and pull the oneapi-aikit Docker image.
   ```
   docker pull intel/oneapi-aikit
   ```
   >**Note**: If a permission denied error occurs, run the following command, and then log in again.
   >```
   >sudo usermod -aG docker $USER
   >```
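After logging back in, you can confirm that the group change took effect with a membership check along these lines (a hypothetical helper, not part of the sample; it assumes the group is named `docker`):

```shell
#!/usr/bin/env bash
# Hypothetical check: does a user belong to the docker group?
in_docker_group() {
  id -nG "$1" 2>/dev/null | tr ' ' '\n' | grep -qx docker
}

if in_docker_group "$USER"; then
  echo "OK: $USER can run docker without sudo"
else
  echo "Not yet in the docker group; log out and back in after usermod"
fi
```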
3. Run the Docker image using the `run_oneapi_docker.sh` Bash script.
   ```
   ./run_oneapi_docker.sh intel/oneapi-aikit
   ```
   The script opens a Bash shell inside the Docker container.

   >**Note**: You can install additional packages by adding them to the requirements.txt file in the sample. Copy the modified requirements.txt into the /tmp folder, and the Bash script will install those packages for you.

   To create a Bash session in the running container from outside the Docker container, enter a command similar to the following.
   ```
   docker exec -it aikit_container /bin/bash
   ```
4. In the Bash shell inside the Docker container, activate the specialized environment.
   ```
   source activate tensorflow
   ```
   or
   ```
   source activate pytorch
   ```
   You can start using Intel® Optimization for TensorFlow* or Intel® Optimization for PyTorch* inside the Docker container.
>**Note**: You can verify the activated environment. Change to the directory with the IntelAIKitContainer sample and run the `version_check.py` script.
>```
>python version_check.py
>```
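The exact contents of `version_check.py` ship with the sample and are not reproduced here; a minimal check in the same spirit (a hypothetical stand-in, and it assumes `python3` is on the PATH) looks like this:

```shell
#!/usr/bin/env bash
# Hypothetical stand-in for version_check.py: report the TensorFlow
# version if the module is importable, otherwise say so.
if command -v python3 >/dev/null; then
  out="$(python3 - <<'EOF'
try:
    import tensorflow as tf
    print("TensorFlow version:", tf.__version__)
except ImportError:
    print("TensorFlow not found in this environment")
EOF
)"
else
  out="TensorFlow check skipped: python3 not found"
fi
echo "$out"
```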
### Manage Docker* Images

You can install additional packages, upload workloads via the `/tmp` folder, and then commit your changes into a new Docker image, for example, `intel/oneapi-aikit-v1`.
```
docker commit -a "intel" -m "test" DOCKER_ID intel/oneapi-aikit-v1
```
>**Note**: Replace `DOCKER_ID` with the ID of your container. Use `docker ps` to get the DOCKER_ID of your Docker container.
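If you prefer to script the lookup instead of copying the ID by hand, `docker ps` accepts a name filter. The sketch below only assembles and prints the commands; the container name `aikit_container` matches the earlier `docker exec` example and is an assumption about your setup.

```shell
#!/usr/bin/env bash
# Illustrative: resolve DOCKER_ID from a container name, then commit.
# The name and image tag are examples, not requirements.
name="aikit_container"
lookup="docker ps -qf name=${name}"
commit="docker commit -a intel -m test \$(${lookup}) intel/oneapi-aikit-v1"

echo "Find the ID:  ${lookup}"
echo "Then commit:  ${commit}"
```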
You can then use the new image name to start Docker.
```
./run_oneapi_docker.sh intel/oneapi-aikit-v1
```

You can save the Docker image as a tar file.
```
docker save -o oneapi-aikit-v1.tar intel/oneapi-aikit-v1
```

You can load the tar file on other machines.
```
docker load -i oneapi-aikit-v1.tar
```
### Docker Proxy

If you encounter a Docker proxy related problem, follow the instructions below to configure the proxy settings for your Docker client.

1. Create a directory for the Docker service configurations.
   ```
   sudo mkdir -p /etc/systemd/system/docker.service.d
   ```
2. Create a file called `proxy.conf` in the configuration directory.
   ```
   sudo vi /etc/systemd/system/docker.service.d/proxy.conf
   ```
3. Add contents similar to the following to the `.conf` file. Change the values to match your environment.
   ```
   [Service]
   Environment="HTTP_PROXY=http://proxy-hostname:911/"
   Environment="HTTPS_PROXY=http://proxy-hostname:911/"
   Environment="NO_PROXY=10.0.0.0/8,192.168.0.0/16,localhost,127.0.0.0/8,134.134.0.0/16"
   ```
4. Save your changes and exit the text editor.
5. Reload the daemon configuration.
   ```
   sudo systemctl daemon-reload
   ```
6. Restart Docker to apply the changes.
   ```
   sudo systemctl restart docker.service
   ```
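A stray double quote in `proxy.conf` silently breaks the proxy settings, so a quoting sanity check can save a round of debugging. This helper is hypothetical (not part of the sample); it writes a temporary copy of the expected format and verifies that each `Environment=` line has exactly one balanced pair of quotes.

```shell
#!/usr/bin/env bash
# Hypothetical sanity check: each Environment= line in proxy.conf should
# contain exactly one balanced pair of double quotes.
conf="$(mktemp)"
cat > "$conf" <<'EOF'
[Service]
Environment="HTTP_PROXY=http://proxy-hostname:911/"
Environment="HTTPS_PROXY=http://proxy-hostname:911/"
Environment="NO_PROXY=10.0.0.0/8,192.168.0.0/16,localhost,127.0.0.0/8"
EOF

# Splitting an Environment= line on '"' should yield exactly 3 fields.
bad="$(awk -F'"' '/^Environment=/ && NF != 3' "$conf")"
if [ -z "$bad" ]; then
  echo "proxy.conf quoting looks OK"
else
  echo "Check quoting on: $bad"
fi
rm -f "$conf"
```

Run it against your real file by replacing the here-document with the path `/etc/systemd/system/docker.service.d/proxy.conf`.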
## Example Output

### Output from TensorFlow* Environment

```
TensorFlow version: 2.6.0
MKL enabled : True
```

### Output from PyTorch* Environment

```
PyTorch Version: 1.8.0a0+37c1f4a
mkldnn : True, mkl : True, openmp : True
```
## License

Code samples are licensed under the MIT license. See [License.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/License.txt) for details.

Third party program licenses can be found here: [third-party-programs.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/third-party-programs.txt).

AI-and-Analytics/Getting-Started-Samples/IntelAIKitContainer_GettingStarted/sample.json (+1 −1)

```diff
@@ -1,6 +1,6 @@
 {
   "guid": "0F95DA9E-0A5D-4CF2-B791-885B09675004",
-  "name": "IntelAIKitContainer_GettingStarted",
+  "name": "Intel(R) AI Analytics Toolkit (AI Kit) Container Getting Started",
   "categories": ["Toolkit/oneAPI AI And Analytics/AI Getting Started Samples"],
   "description": "This sample illustrates how to utilize the oneAPI AI Kit container.",
   "builder": ["cli"],
```
