Glossary of AI Terminology (人工智能專用名詞)

Letter A
Accumulated error backpropagation - 累積誤差逆傳播
Activation Function - 激活函數
Adaptive Resonance Theory / ART - 自適應諧振理論
Additive model - 加性學習
Adversarial Networks - 對抗網絡
Affine Layer - 仿射層
Affinity matrix - 親和矩陣
Agent - 代理/智能體
Algorithm - 算法
Alpha-beta pruning - α-β剪枝
Anomaly detection - 異常檢測
Approximation - 近似
Area Under ROC Curve / AUC - ROC曲線下面積
Artificial General Intelligence / AGI - 通用人工智能
Artificial Intelligence / AI - 人工智能
Association analysis - 關聯分析
Attention mechanism - 注意力機制
Attribute conditional independence assumption - 屬性條件獨立性假設
Attribute space - 屬性空間
Attribute value - 屬性值
Autoencoder - 自編碼器
Automatic speech recognition - 自動語音識別
Automatic summarization - 自動摘要
Average gradient - 平均梯度
Average-Pooling - 平均池化

Letter B
Backpropagation Through Time - 通過時間的反向傳播
Backpropagation / BP - 反向傳播
Base learner - 基學習器
Base learning algorithm - 基學習算法
Batch Normalization / BN - 批量歸一化
Bayes decision rule - 貝葉斯判定準則
Bayes Model Averaging / BMA - 貝葉斯模型平均
Bayes optimal classifier - 貝葉斯最優分類器
Bayesian decision theory - 貝葉斯決策論
Bayesian network - 貝葉斯網絡
Between-class scatter matrix - 類間散度矩陣
Bias - 偏置/偏差
Bias-variance decomposition - 偏差-方差分解
Bias-Variance Dilemma - 偏差-方差困境
Bi-directional Long-Short Term Memory / Bi-LSTM - 雙向長短期記憶
Binary classification - 二分類
Binomial test - 二項檢驗
Bi-partition - 二分法
Boltzmann machine - 玻爾茲曼機
Bootstrap sampling - 自助采樣法/可重復采樣/有放回采樣
Bootstrapping - 自助法
Break-Event Point / BEP - 平衡點

Letter C
Calibration - 校準
Cascade-Correlation - 級聯相關
Categorical attribute - 離散屬性
Class-conditional probability - 類條件概率
Classification and regression tree / CART - 分類與回歸樹
Classifier - 分類器
Class-imbalance - 類別不平衡
Closed-form - 閉式
Cluster - 簇/類/集群
Cluster analysis - 聚類分析
Clustering - 聚類
Clustering ensemble - 聚類集成
Co-adapting - 共適應
Coding matrix - 編碼矩陣
COLT - 國際學習理論會議
Committee-based learning - 基于委員會的學習
Competitive learning - 競爭型學習
Component learner - 組件學習器
Comprehensibility - 可解釋性
Computation Cost - 計算成本
Computational Linguistics - 計算語言學
Computer vision - 計算機視覺
Concept drift - 概念漂移
Concept Learning System / CLS - 概念學習系統
Conditional entropy - 條件熵
Conditional mutual information - 條件互信息
Conditional Probability Table / CPT - 條件概率表
Conditional random field / CRF - 條件隨機場
Conditional risk - 條件風險

Confidence - 置信度
Confusion matrix - 混淆矩陣
Connection weight - 連接權
Connectionism - 連結主義
Consistency - 一致性/相合性
Contingency table - 列聯表
Continuous attribute - 連續屬性
Convergence - 收斂
Conversational agent - 會話智能體
Convex quadratic programming - 凸二次規劃
Convexity - 凸性
Convolutional neural network / CNN - 卷積神經網絡
Co-occurrence - 同現
Correlation coefficient - 相關系數
Cosine similarity - 余弦相似度
Cost curve - 成本曲線
Cost Function - 成本函數
Cost matrix - 成本矩陣
Cost-sensitive - 成本敏感
Cross entropy - 交叉熵
Cross validation - 交叉驗證
Crowdsourcing - 眾包
Curse of dimensionality - 維數災難
Cut point - 截斷點
Cutting plane algorithm - 割平面法

Letter D
Data mining - 數據挖掘
Data set - 數據集
Decision Boundary - 決策邊界
Decision stump - 決策樹樁
Decision tree - 決策樹/判定樹
Deduction - 演繹
Deep Belief Network - 深度信念網絡
Deep Convolutional Generative Adversarial Network / DCGAN - 深度卷積生成對抗網絡
Deep learning - 深度學習
Deep neural network / DNN - 深度神經網絡
Deep Q-Learning - 深度Q學習
Deep Q-Network - 深度Q網絡
Density estimation - 密度估計
Density-based clustering - 密度聚類
Differentiable neural computer - 可微分神經計算機
Dimensionality reduction algorithm - 降維算法
Directed edge - 有向邊
Disagreement measure - 不合度量
Discriminative model - 判別模型
Discriminator - 判別器
Distance measure - 距離度量
Distance metric learning - 距離度量學習
Distribution - 分布
Divergence - 散度
Diversity measure - 多樣性度量/差異性度量
Domain adaptation - 領域自適應
Downsampling - 下采樣
D-separation (Directed separation) - 有向分離
Dual problem - 對偶問題
Dummy node - 啞結點
Dynamic Fusion - 動態融合
Dynamic programming - 動態規劃

Letter E
Eigenvalue decomposition - 特征值分解
Embedding - 嵌入
Emotional analysis - 情緒分析
Empirical conditional entropy - 經驗條件熵
Empirical entropy - 經驗熵
Empirical error - 經驗誤差
Empirical risk - 經驗風險
End-to-End - 端到端
Energy-based model - 基于能量的模型
Ensemble learning - 集成學習
Ensemble pruning - 集成修剪
Error Correcting Output Codes / ECOC - 糾錯輸出碼
Error rate - 錯誤率
Error-ambiguity decomposition - 誤差-分歧分解
Euclidean distance - 歐氏距離
Evolutionary computation - 演化計算
Expectation-Maximization - 期望最大化
Expected loss - 期望損失
Exploding Gradient Problem - 梯度爆炸問題
Exponential loss function - 指數損失函數
Extreme Learning Machine / ELM - 超限學習機

Letter F
Factorization - 因子分解
False negative - 假負類
False positive - 假正類
False Positive Rate / FPR - 假正例率
Feature engineering - 特征工程
Feature selection - 特征選擇
Feature vector - 特征向量
Featured Learning - 特征學習
Feedforward Neural Networks / FNN - 前饋神經網絡
Fine-tuning - 微調
Flipping output - 翻轉法
Fluctuation - 震蕩
Forward stagewise algorithm - 前向分步算法
Frequentist - 頻率主義學派
Full-rank matrix - 滿秩矩陣
Functional neuron - 功能神經元

Letter G
Gain ratio - 增益率
Game theory - 博弈論
Gaussian kernel function - 高斯核函數
Gaussian Mixture Model - 高斯混合模型
General Problem Solving - 通用問題求解
Generalization - 泛化
Generalization error - 泛化誤差
Generalization error bound - 泛化誤差上界

Generalized Lagrange function - 廣義拉格朗日函數
Generalized linear model - 廣義線性模型
Generalized Rayleigh quotient - 廣義瑞利商
Generative Adversarial Networks / GAN - 生成對抗網絡
Generative Model - 生成模型
Generator - 生成器
Genetic Algorithm / GA - 遺傳算法
Gibbs sampling - 吉布斯采樣
Gini index - 基尼指數
Global minimum - 全局最小
Global Optimization - 全局優化
Gradient boosting - 梯度提升
Gradient Descent - 梯度下降
Graph theory - 圖論
Ground-truth - 真相/真實

Letter H
Hard margin - 硬間隔
Hard voting - 硬投票
Harmonic mean - 調和平均
Hessian matrix - 海塞矩陣
Hidden dynamic model - 隱動態模型
Hidden layer - 隱藏層
Hidden Markov Model / HMM - 隱馬爾可夫模型
Hierarchical clustering - 層次聚類
Hilbert space - 希爾伯特空間
Hinge loss function - 合頁損失函數
Hold-out - 留出法

Homogeneous - 同質
Hybrid computing - 混合計算
Hyperparameter - 超參數
Hypothesis - 假設
Hypothesis test - 假設檢驗

Letter I
ICML - 國際機器學習會議
Improved iterative scaling / IIS - 改進的迭代尺度法
Incremental learning - 增量學習
Independent and identically distributed / i.i.d. - 獨立同分布
Independent Component Analysis / ICA - 獨立成分分析
Indicator function - 指示函數
Individual learner - 個體學習器
Induction - 歸納
Inductive bias - 歸納偏好
Inductive learning - 歸納學習
Inductive Logic Programming / ILP - 歸納邏輯程序設計
Information entropy - 信息熵
Information gain - 信息增益
Input layer - 輸入層
Insensitive loss - 不敏感損失
Inter-cluster similarity - 簇間相似度
International Conference for Machine Learning / ICML - 國際機器學習大會
Intra-cluster similarity - 簇內相似度
Intrinsic value - 固有值
Isometric Mapping / Isomap - 等度量映射
Isotonic regression - 等分回歸
Iterative Dichotomiser - 迭代二分器

Letter K
Kernel method - 核方法
Kernel trick - 核技巧
Kernelized Linear Discriminant Analysis / KLDA - 核線性判別分析
K-fold cross validation - k折交叉驗證/k倍交叉驗證
K-Means Clustering - K-均值聚類
K-Nearest Neighbours Algorithm / KNN - K近鄰算法
Knowledge base - 知識庫
Knowledge Representation - 知識表征

Letter L
Label space - 標記空間
Lagrange duality - 拉格朗日對偶性
Lagrange multiplier - 拉格朗日乘子
Laplace smoothing - 拉普拉斯平滑
Laplacian correction - 拉普拉斯修正
Latent Dirichlet Allocation - 隱狄利克雷分布
Latent semantic analysis - 潛在語義分析
Latent variable - 隱變量
Lazy learning - 懶惰學習
Learner - 學習器
Learning by analogy - 類比學習
Learning rate - 學習率
Learning Vector Quantization / LVQ - 學習向量量化
Least squares regression tree - 最小二乘回歸樹
Leave-One-Out / LOO - 留一法
Linear chain conditional random field - 線性鏈條件隨機場
Linear Discriminant Analysis / LDA - 線性判別分析
Linear model - 線性模型
Linear Regression - 線性回歸
Link function - 聯系函數
Local Markov property - 局部馬爾可夫性
Local minimum - 局部最小
Log likelihood - 對數似然
Log odds / logit - 對數幾率
Logistic Regression - Logistic回歸
Log-likelihood - 對數似然
Log-linear regression - 對數線性回歸
Long-Short Term Memory / LSTM - 長短期記憶
Loss function - 損失函數

Letter M
Machine translation / MT - 機器翻譯
Macro-P - 宏查準率
Macro-R - 宏查全率
Majority voting - 絕對多數投票法
Manifold assumption - 流形假設
Manifold learning - 流形學習
Margin theory - 間隔理論
Marginal distribution - 邊際分布
Marginal independence - 邊際獨立性
Marginalization - 邊際化
Markov Chain Monte Carlo / MCMC - 馬爾可夫鏈蒙特卡羅方法

Markov Random Field - 馬爾可夫隨機場
Maximal clique - 最大團
Maximum Likelihood Estimation / MLE - 極大似然估計/極大似然法
Maximum margin - 最大間隔
Maximum weighted spanning tree - 最大帶權生成樹
Max-Pooling - 最大池化
Mean squared error - 均方誤差
Meta-learner - 元學習器
Metric learning - 度量學習
Micro-P - 微查準率
Micro-R - 微查全率
Minimal Description Length / MDL - 最小描述長度
Minimax game - 極小極大博弈
Misclassification cost - 誤分類成本
Mixture of experts - 混合專家
Momentum - 動量
Moral graph - 道德圖/端正圖
Multi-class classification - 多分類
Multi-document summarization - 多文檔摘要
Multi-layer feedforward neural networks - 多層前饋神經網絡
Multilayer Perceptron / MLP - 多層感知器
Multimodal learning - 多模態學習
Multiple Dimensional Scaling - 多維縮放
Multiple linear regression - 多元線性回歸
Mutual information - 互信息

Letter N
Naive Bayes - 樸素貝葉斯
Naive Bayes Classifier - 樸素貝葉斯分類器
Named entity recognition - 命名實體識別
Nash equilibrium - 納什均衡
Natural language generation / NLG - 自然語言生成
Natural language processing - 自然語言處理
Negative class - 負類
Negative correlation - 負相關法
Negative Log Likelihood - 負對數似然
Neighbourhood Component Analysis / NCA - 近鄰成分分析
Neural Machine Translation - 神經機器翻譯
Neural Turing Machine - 神經圖靈機
Newton method - 牛頓法
NIPS - 國際神經信息處理系統會議
No Free Lunch Theorem / NFL - 沒有免費的午餐定理
Noise-contrastive estimation - 噪音對比估計
Nominal attribute - 列名屬性
Non-convex optimization - 非凸優化
Nonlinear model - 非線性模型
Non-metric distance - 非度量距離
Non-negative matrix factorization - 非負矩陣分解
Non-ordinal attribute - 無序屬性
Non-Saturating Game - 非飽和博弈
Norm - 范數
Normalization - 歸一化

Nuclear norm - 核范數
Numerical attribute - 數值屬性

Letter O
Objective function - 目標函數
Oblique decision tree - 斜決策樹
Occam's razor - 奧卡姆剃刀
Odds - 幾率
Off-Policy - 離策略
One shot learning - 一次性學習
One-Dependent Estimator / ODE - 獨依賴估計
On-Policy - 在策略
Ordinal attribute - 有序屬性
Out-of-bag estimate - 包外估計
Output layer - 輸出層
Output smearing - 輸出調制法
Overfitting - 過擬合/過配
Oversampling - 過采樣

Letter P
Paired t-test - 成對t檢驗
Pairwise - 成對型
Pairwise Markov property - 成對馬爾可夫性
Parameter - 參數
Parameter estimation - 參數估計
Parameter tuning - 調參
Parse tree - 解析樹
Particle Swarm Optimization / PSO - 粒子群優化算法
Part-of-speech tagging - 詞性標注
Perceptron - 感知機

Performance measure - 性能度量
Plug and Play Generative Network - 即插即用生成網絡
Plurality voting - 相對多數投票法
Polarity detection - 極性檢測
Polynomial kernel function - 多項式核函數
Pooling - 池化
Positive class - 正類
Positive definite matrix - 正定矩陣
Post-hoc test - 后續檢驗
Post-pruning - 后剪枝
Potential function - 勢函數
Precision - 查準率/準確率
Pre-pruning - 預剪枝
Principal component analysis / PCA - 主成分分析
Principle of multiple explanations - 多釋原則
Prior - 先驗
Probability Graphical Model - 概率圖模型
Proximal Gradient Descent / PGD - 近端梯度下降
Pruning - 剪枝
Pseudo-label - 偽標記

Letter Q
Quantized Neural Network - 量子化神經網絡
Quantum computer - 量子計算機
Quantum Computing - 量子計算
Quasi-Newton method - 擬牛頓法

Letter R
Radial Basis Function / RBF - 徑向基函數
Random Forest Algorithm - 隨機森林算法

Random walk - 隨機漫步
Recall - 查全率/召回率
Receiver Operating Characteristic / ROC - 受試者工作特征
Rectified Linear Unit / ReLU - 線性修正單元
Recurrent Neural Network - 循環神經網絡
Recursive neural network - 遞歸神經網絡
Reference model - 參考模型
Regression - 回歸
Regularization - 正則化
Reinforcement learning / RL - 強化學習
Representation learning - 表征學習
Representer theorem - 表示定理
Reproducing kernel Hilbert space / RKHS - 再生核希爾伯特空間
Re-sampling - 重采樣法
Rescaling - 再縮放
Residual Mapping - 殘差映射
Residual Network - 殘差網絡
Restricted Boltzmann Machine / RBM - 受限玻爾茲曼機
Restricted Isometry Property / RIP - 限定等距性
Re-weighting - 重賦權法
Robustness - 穩健性/魯棒性
Root node - 根結點
Rule Engine - 規則引擎
Rule learning - 規則學習

Letter S
Saddle point - 鞍點
Sample space - 樣本空間
Sampling - 采樣
Score function - 評分函數
Self-Driving - 自動駕駛
Self-Organizing Map / SOM - 自組織映射
Semi-naive Bayes classifiers - 半樸素貝葉斯分類器
Semi-Supervised Learning - 半監督學習
Semi-Supervised Support Vector Machine - 半監督支持向量機
Sentiment analysis - 情感分析
Separating hyperplane - 分離超平面
Sigmoid function - Sigmoid函數
Similarity measure - 相似度度量
Simulated annealing - 模擬退火
Simultaneous localization and mapping - 同步定位與地圖構建
Singular Value Decomposition - 奇異值分解
Slack variables - 松弛變量
Smoothing - 平滑
Soft margin - 軟間隔
Soft margin maximization - 軟間隔最大化
Soft voting - 軟投票
Sparse representation - 稀疏表征
Sparsity - 稀疏性
Specialization - 特化
Spectral Clustering - 譜聚類
Speech Recognition - 語音識別

Splitting variable - 切分變量
Squashing function - 擠壓函數
Stability-plasticity dilemma - 可塑性-穩定性困境
Statistical learning - 統計學習
Status feature function - 狀態特征函數
Stochastic gradient descent - 隨機梯度下降
Stratified sampling - 分層采樣
Structural risk - 結構風險
Structural risk minimization / SRM - 結構風險最小化
Subspace - 子空間
Supervised learning - 監督學習/有導師學習
Support vector expansion - 支持向量展式
Support Vector Machine / SVM - 支持向量機
Surrogate loss - 替代損失
Surrogate function - 替代函數
Symbolic learning - 符號學習
Symbolism - 符號主義
Synset - 同義詞集

Letter T
T-Distribution Stochastic Neighbour Embedding / t-SNE - T-分布隨機近鄰嵌入
Tensor - 張量
Tensor Processing Units / TPU - 張量處理單元
The least square method - 最小二乘法
Threshold - 閾值
Threshold logic unit - 閾值邏輯單元
Threshold-moving - 閾值移動
Time Step - 時間步驟
Tokenization - 標記化
Training error - 訓練誤差
Training instance - 訓練示例/訓練例
Transductive learning - 直推學習
Transfer learning - 遷移學習
Treebank - 樹庫
Trial-by-error - 試錯法
True negative - 真負類
True positive - 真正類
True Positive Rate / TPR - 真正例率
Turing Machine - 圖靈機
Twice-learning - 二次學習

Letter U
Underfitting - 欠擬合/欠配
Undersampling - 欠采樣
Understandability - 可理解性
Unequal cost - 非均等代價
Unit-step function - 單位階躍函數
Univariate decision tree - 單變量決策樹
Unsupervised learning - 無監督學習/無導師學習
Unsupervised layer-wise training - 無監督逐層訓練
Upsampling - 上采樣

Letter V
Vanishing Gradient Problem - 梯度消失問題
Variational inference - 變分推斷
VC Theory - VC維理論
Version space - 版本空間
Viterbi algorithm - 維特比算法
Von Neumann architecture - 馮·諾伊曼架構

Letter W
Wasserstein GAN / WGAN - Wasserstein生成對抗網絡
Weak learner - 弱學習器
Weight - 權重
Weight sharing - 權共享
Weighted voting - 加權投票法
Within-class scatter matrix - 類內散度矩陣
Word embedding - 詞嵌入
Word sense disambiguation - 詞義消歧

Letter Z
Zero-data learning - 零數據學習
Zero-shot learning - 零次學習

Pattern Recognition
With the rapid development of computer hardware and the continual expansion of computer applications, computers are urgently required to perceive information such as sound, text, images, temperature, and vibration more effectively, and pattern recognition has therefore developed rapidly. The word "pattern" originally denotes a flawless specimen held up for imitation; pattern recognition thus means identifying the specimen that a given object imitates. The pattern recognition studied in artificial intelligence refers to using computers to replace or assist humans in perceiving patterns. It simulates the human ability to perceive the external world, and its object of study is the computer pattern recognition system: a computer system that, like a human receiving information through the senses, can recognize and understand its surroundings. Pattern recognition is a continually developing discipline whose theoretical foundations and scope keep evolving. Building on early biomedical insights into the human brain, computer experiments simulating its structure, namely artificial neural network methods, began as early as the late 1950s and early 1960s. To date, neural network methods in pattern recognition have been successfully applied to handwritten character recognition, license plate recognition, fingerprint recognition, speech recognition, and more. The discipline is now in a stage of great development; as its applications broaden and computer science advances, pattern recognition technology based on artificial neural networks will develop even further in the 1990s.

Machine Vision
Machine vision, or computer vision, has grown out of pattern recognition as one of its research areas
