常用英语词汇 - Andrew Ng 课程

intensity | 强度
Regression | 回归
Loss function | 损失函数
non-convex | 非凸函数
neural network | 神经网络
supervised learning | 监督学习
regression problem | 回归问题(处理的是连续值问题)
classification problem | 分类问题
discrete value | 离散值
support vector machines | 支持向量机
learning theory | 学习理论
learning algorithms | 学习算法
unsupervised learning | 无监督学习
gradient descent | 梯度下降
linear regression | 线性回归
normal equations | 正规方程
linear algebra | 线性代数
superscript | 上标
exponentiation | 指数
training set | 训练集
training example | 训练样本
hypothesis | 假设(用来表示学习算法的输出)
LMS algorithm (least mean squares) | 最小均方算法
batch gradient descent | 批量梯度下降
stochastic gradient descent | 随机梯度下降
iterative algorithm | 迭代算法
partial derivative | 偏导数
contour | 等高线
quadratic function | 二次函数
locally weighted regression | 局部加权回归
underfitting | 欠拟合
overfitting | 过拟合
non-parametric learning algorithms | 非参数学习算法
parametric learning algorithm | 参数学习算法
activation | 激活值
activation function | 激活函数
additive noise | 加性噪声
autoencoder | 自编码器
Autoencoders | 自编码算法

average firing rate | 平均激活率
average sum-of-squares error | 均方误差
backpropagation | 反向传播
basis | 基
basis feature vectors | 特征基向量
batch gradient ascent | 批量梯度上升法
Bayesian regularization method | 贝叶斯正则化方法
Bernoulli random variable | 伯努利随机变量
bias term | 偏置项
binary classification | 二元分类
class labels | 类别标记
concatenation | 级联
conjugate gradient | 共轭梯度
contiguous groups | 连通区域
convex optimization software | 凸优化软件
convolution | 卷积
cost function | 代价函数
covariance matrix | 协方差矩阵
DC component | 直流分量
decorrelation | 去相关
degeneracy | 退化
dimensionality reduction | 降维
derivative | 导函数
diagonal | 对角线
diffusion of gradients | 梯度弥散
eigenvalue | 特征值
eigenvector | 特征向量
error term | 残差
feature matrix | 特征矩阵
feature standardization | 特征标准化
feedforward architectures | 前馈结构
feedforward neural network | 前馈神经网络
feedforward pass | 前馈传导
fine-tuned | 微调
first-order feature | 一阶特征
forward pass | 前向传导
forward propagation | 前向传播
Gaussian prior | 高斯先验概率
generative model | 生成模型
gradient descent | 梯度下降
Greedy layer-wise training | 逐层贪心训练方法
grouping matrix | 分组矩阵
Hadamard product | 阿达马乘积
Hessian matrix | Hessian 矩阵
hidden layer | 隐含层
hidden units | 隐藏神经元
Hierarchical grouping | 层次型分组
higher-order features | 更高阶特征
highly non-convex optimization problem | 高度非凸的优化问题
histogram | 直方图
hyperbolic tangent | 双曲正切函数
hypothesis | 假设,估值
identity activation function | 恒等激活函数
IID | 独立同分布
illumination | 照明
inactive | 抑制
independent component analysis | 独立成分分析
input domains | 输入域
input layer | 输入层
intensity | 亮度/灰度
intercept term | 截距
KL divergence | 相对熵
KL divergence | KL 散度
k-Means | K-均值
learning rate | 学习速率
least squares | 最小二乘法
linear correspondence | 线性响应
linear superposition | 线性叠加
line-search algorithm | 线搜索算法
local mean subtraction | 局部均值消减
local optima | 局部最优解
logistic regression | 逻辑回归
loss function | 损失函数
low-pass filtering | 低通滤波
magnitude | 幅值
MAP | 极大后验估计
maximum likelihood estimation | 极大似然估计
mean | 平均值
MFCC | Mel 频率倒谱系数
multi-class classification | 多元分类
neural networks | 神经网络
neuron | 神经元
Newton’s method | 牛顿法