Course Introduction
1. Become familiar with and master the core principles behind deep learning, from matrix operations to stochastic gradient descent, and understand how different deep learning models work along with their strengths and weaknesses.
2. Gain a deep understanding of how deep learning models relate to and build on one another, and learn about applications of deep learning in computer vision, natural language processing, and other fields.
3. The course provides supplementary study materials and classic papers carefully curated by the instructor, handed out at different stages of the course for review and further study.
4. The course is bilingual: the slides are in English and the lectures are delivered in Chinese, laying a solid foundation for reading cutting-edge international research papers directly.
Learning Outcomes
Target Audience
Course Outline
Mathematics for Machine Intelligence
1. Numbers and Functions
2. Linear Algebra
3. Probability and Statistics
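The following is a minimal NumPy warm-up, illustrative only and not part of the course materials, touching the three topics in this module: a function and its derivative, a linear-algebra solve, and basic sampling statistics. The specific functions and numbers are arbitrary choices.

```python
# Warm-up for the three topics in this module: functions and derivatives,
# linear algebra, and probability/statistics.
import numpy as np

rng = np.random.default_rng(0)

# Numbers and functions: compare an analytic derivative with a finite difference.
f = lambda x: x ** 2 + 3 * x            # f(x) = x^2 + 3x, so f'(x) = 2x + 3
x0, h = 1.5, 1e-5
numeric = (f(x0 + h) - f(x0 - h)) / (2 * h)
print("analytic f'(1.5) =", 2 * x0 + 3, " numeric =", numeric)

# Linear algebra: matrix-vector products via a least-squares solve.
A = rng.normal(size=(4, 2))
b = rng.normal(size=4)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)   # argmin_x ||Ax - b||^2
print("least-squares solution:", x_ls)

# Probability and statistics: sample mean and std of a Gaussian.
samples = rng.normal(loc=2.0, scale=0.5, size=10_000)
print("sample mean:", samples.mean(), " sample std:", samples.std())
```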
Generalized Linear Model
1. Linear Model
2. Gradient Descent
3. Generalized Linear Model
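Below is a minimal sketch of batch gradient descent fitting a linear model on synthetic data, illustrating topics 1 and 2 of this module. The data, learning rate, and step count are arbitrary illustrative choices, not course specifications.

```python
# Batch gradient descent on a linear regression model with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                   # 200 samples, 3 features
true_w, true_b = np.array([2.0, -1.0, 0.5]), 0.3
y = X @ true_w + true_b + 0.1 * rng.normal(size=200)

w, b, lr = np.zeros(3), 0.0, 0.1
for step in range(500):
    y_hat = X @ w + b                           # linear model prediction
    err = y_hat - y
    grad_w = X.T @ err / len(y)                 # gradient of the 1/2 * mean-squared-error loss
    grad_b = err.mean()
    w -= lr * grad_w
    b -= lr * grad_b

print("recovered w:", w.round(2), " b:", round(b, 2))
```

Stochastic gradient descent, covered in the lectures, replaces the full-batch gradient above with an estimate computed on a mini-batch of samples.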
Classical Neural Networks
1. Feed-forward Networks
2. Back-propagation
3. Universal Function Approximation
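A minimal two-layer feed-forward network trained on XOR with hand-written back-propagation, sketching topics 1 and 2 of this module. The architecture and hyperparameters are illustrative assumptions rather than the course's own settings.

```python
# Tiny MLP on XOR with manual back-propagation.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass (binary cross-entropy + sigmoid gives p - y at the output)
    d_out = (p - y) / len(X)
    dW2, db2 = h.T @ d_out, d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)         # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = X.T @ d_h, d_h.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("predictions:", p.round(2).ravel())       # should approach [0, 1, 1, 0]
```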
Convolutional Neural Networks
1. Convolution of Visual Features
2. CNN Structure and Convolution Tricks
3. CNN Variants: from LeNet to ResNet
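A minimal sketch of the 2-D convolution (cross-correlation, as used in CNNs) that underlies topic 1 of this module, applying a vertical-edge filter to a toy image. The naive nested-loop implementation and the Sobel kernel are illustrative choices.

```python
# Direct 2-D convolution (valid cross-correlation) of an image with a kernel.
import numpy as np

def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: dark left half, bright right half, i.e. a vertical edge in the middle.
image = np.zeros((6, 6))
image[:, 3:] = 1.0

sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)   # vertical-edge detector

print(conv2d(image, sobel_x))                   # strong response at the edge
```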
CNN Applications
1. Segmentation, Recognition and Detection
2. R-CNN, Action Recognition and Pose Estimation
3. U-Net and the Pix2Pix Model
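Full detection or segmentation pipelines are too large to sketch here, but one small building block commonly used when evaluating detectors such as R-CNN is intersection-over-union (IoU). The helper below is an illustrative sketch, not course material; the box format (x1, y1, x2, y2) is an assumption.

```python
# Intersection-over-union between two axis-aligned boxes given as (x1, y1, x2, y2).
def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))   # 1 / 7, roughly 0.143
```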
Recurrent Neural Networks
1. Recurrent Networks
2. Long Short-Term Memory (LSTM)
3. Neural Language Models and Word Embeddings
4. Attention Mechanism
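A minimal forward pass of a vanilla recurrent cell over a toy sequence, illustrating topic 1 of this module; LSTM cells and attention refine this same recurrence. Dimensions and weights are arbitrary illustrative choices.

```python
# Vanilla RNN cell unrolled over a short input sequence.
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 5, 3, 4                      # sequence length, input dim, hidden dim
xs = rng.normal(size=(T, d_in))             # toy input sequence

W_xh = rng.normal(scale=0.5, size=(d_in, d_h))
W_hh = rng.normal(scale=0.5, size=(d_h, d_h))
b_h = np.zeros(d_h)

h = np.zeros(d_h)                           # initial hidden state
for t in range(T):
    # The same weights are reused at every time step.
    h = np.tanh(xs[t] @ W_xh + h @ W_hh + b_h)
    print(f"h[{t}] =", h.round(3))
```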
Transformers for NLP
1. The Transformer Model Explained
2. Applications of the Transformer
3. Deep Generative Models
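A minimal sketch of scaled dot-product attention, softmax(QK^T / sqrt(d_k))V, the core operation inside the Transformer. The shapes and random inputs are illustrative assumptions.

```python
# Scaled dot-product attention over random queries, keys, and values.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights                   # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))
out, attn = scaled_dot_product_attention(Q, K, V)
print("output shape:", out.shape, " attention rows sum to:", attn.sum(-1).round(3))
```

Multi-head attention, as used in the full Transformer, runs several such attention operations in parallel on learned projections of Q, K, and V and concatenates the results.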