
2017 Beijing Normal University Subject 755 Computational Neuroscience Master's Entrance Examination Syllabus

Author: 聚创北师大考研网 | Published: 2016-12-01 11:19

  The 2017 Beijing Normal University Subject 755 Computational Neuroscience master's entrance examination syllabus has been released. 聚英考研信息网 has compiled it for candidates as follows:


  2017 Beijing Normal University Master's Entrance Examination Syllabus

  755 Computational Neuroscience

  Part One: Examination Description

  I. Nature of the Examination

  This Computational Neuroscience syllabus applies to the master's entrance examination for all programs of the School of Brain and Cognitive Sciences at Beijing Normal University. Candidates are expected to have a comprehensive, systematic command of the basic concepts and research methods of computational neuroscience, and to be able to apply that knowledge fluently to the analysis of fundamental problems in neurobiology. The examination is intended for candidates registered to sit the university's master's entrance examination.

 II. Examination Format and Paper Structure

  (1) Format: closed-book, written examination

  (2) Duration: 180 minutes

  (3) Question types and point values (300 points total)

  1. Term definitions: 30 points

  2. Short-answer questions: 90 points

  3. Essay questions: 180 points

  III. Examination Requirements

  1. Master the basic concepts and foundational theories of computational neuroscience

  2. Be familiar with classic developments in computational neuroscience

  3. Be able to apply the basic concepts and theories to analyze and solve problems

 Part Two: Topics Covered

  PART I - ANALYZING AND MODELING NEURAL RESPONSES

  Chapter 1 - Neural Encoding I: Firing Rates and Spike Statistics

  1. Properties of Neurons; Recording Neuronal Responses; From Stimulus to Response

  2. Spike Trains and Firing Rates; Tuning Curves; Spike-Count Variability

  3. Describing the Stimulus; The Spike-Triggered Average; White-Noise Stimuli; Multiple-Spike-Triggered Averages and Spike-Triggered Correlations

  4. Spike Train Statistics; The Homogeneous Poisson Process; The Spike-Train Autocorrelation Function; The Inhomogeneous Poisson Process; The Poisson Spike Generator; Comparison with Data

  5. The Neural Code; Independent-Spike, Independent-Neuron, and Correlation Codes; Temporal Codes
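The Poisson-process topics above lend themselves to a short simulation. Below is a minimal sketch in Python/NumPy of a homogeneous Poisson spike generator; the rate, duration, bin width, and seed are illustrative choices, not values from the syllabus:

```python
import numpy as np

# Homogeneous Poisson spike generator: in each small time bin of width dt,
# a spike occurs independently with probability r * dt (rate r in Hz).
rng = np.random.default_rng(0)
rate = 50.0          # firing rate in Hz (illustrative)
dt = 0.001           # 1 ms bins
duration = 10.0      # seconds
n_bins = int(duration / dt)

spikes = rng.random(n_bins) < rate * dt        # Boolean spike train
spike_times = np.nonzero(spikes)[0] * dt

# Diagnostics: the empirical rate should approximate the nominal rate,
# and interspike intervals should be roughly exponential, so the
# coefficient of variation of the ISIs should be close to 1.
empirical_rate = spikes.sum() / duration
isis = np.diff(spike_times)
cv = isis.std() / isis.mean()
```

Refractoriness or rate modulation (the inhomogeneous Poisson process) can be added by making `rate` a function of time, at which point the ISI statistics deviate from the exponential form checked here.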

  Chapter 2 - Neural Encoding II: Reverse Correlation and Receptive Fields

  1. Estimating Firing Rates; The Most Effective Stimulus; Static Nonlinearities

  2. Early Visual System; The Retinotopic Map; Visual Stimuli; The Nyquist Frequency

  3. Reverse Correlation Methods - Simple Cells; Spatial Receptive Fields; Temporal Receptive Fields; Response of a Simple Cell to a Counterphase Grating; Space-Time Receptive Fields; Nonseparable Receptive Fields; Static Nonlinearities - Simple Cells

  4. Static Nonlinearities - Complex Cells

  5. Receptive Fields in the Retina and LGN

  6. Constructing V1 Receptive Fields
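The reverse-correlation topics in this chapter can be illustrated with a toy linear-nonlinear model neuron: drive it with white noise and recover its temporal kernel from the spike-triggered average. The kernel shape, rates, and rectifying nonlinearity below are illustrative assumptions, not part of the syllabus:

```python
import numpy as np

# Spike-triggered average (STA) sketch: a model neuron filters a white-noise
# stimulus with a temporal kernel, rectifies, and fires Poisson spikes.
# Averaging the stimulus preceding each spike recovers the kernel shape.
rng = np.random.default_rng(4)
dt = 0.005
n_steps = 200000
stim = rng.standard_normal(n_steps)           # white-noise stimulus

t = np.arange(0, 0.1, dt)                     # 100 ms kernel support
kernel = np.exp(-t / 0.02) * np.sin(2 * np.pi * t / 0.05)  # "true" kernel
drive = np.convolve(stim, kernel)[:n_steps]   # causal linear filtering

rate = 20.0 * np.maximum(0.0, drive)          # half-wave rectification
spikes = rng.random(n_steps) < rate * dt      # Poisson spike generation

# STA: mean stimulus over the 100 ms preceding (and including) each spike,
# indexed so sta[j] is the average stimulus j bins before a spike.
k = len(kernel)
spike_idx = np.nonzero(spikes)[0]
spike_idx = spike_idx[spike_idx >= k]
sta = np.zeros(k)
for i in spike_idx:
    sta += stim[i - k + 1 : i + 1][::-1]
sta /= len(spike_idx)
```

For Gaussian white-noise stimuli the STA is proportional to the linear kernel, so the recovered `sta` should be strongly correlated with `kernel`.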

  Chapter 3 - Neural Decoding

  1. Encoding and Decoding

  2. Discrimination; ROC Curves; ROC Analysis of Motion Discrimination; The Likelihood Ratio Test

  3. Population Decoding; Encoding and Decoding Direction; Optimal Decoding Methods; Fisher Information; Optimal Discrimination

  4. Spike Train Decoding
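The population-decoding topics above ("Encoding and Decoding Direction") can be sketched with the classic population-vector method applied to cosine-tuned model neurons. The tuning curves, peak rate, and number of neurons are illustrative assumptions:

```python
import numpy as np

# Population-vector decoding: each neuron has a preferred direction and a
# half-wave rectified cosine tuning curve; the stimulus direction is
# estimated from the response-weighted sum of preferred-direction vectors.
rng = np.random.default_rng(1)
n_neurons = 64
preferred = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)

def rates(theta, r_max=50.0):
    # Mean response of each neuron to direction theta (cosine tuning).
    return r_max * np.maximum(0.0, np.cos(theta - preferred))

true_theta = 1.0                      # stimulus direction in radians
r = rng.poisson(rates(true_theta))    # noisy spike counts

# Population vector: sum of each neuron's preferred-direction unit vector,
# weighted by its observed response.
pop_x = np.sum(r * np.cos(preferred))
pop_y = np.sum(r * np.sin(preferred))
decoded_theta = np.arctan2(pop_y, pop_x)
```

The population vector is simple but not optimal; maximum-likelihood decoding and the Fisher-information bound listed above quantify how much better one can do with the same responses.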

  PART II - MODELING NEURONS AND NETWORKS

  Chapter 4 - Model Neurons I: Neuroelectronics

  1. Levels of Neuron Modeling

  2. Electrical Properties of Neurons

  Intracellular Resistance; Membrane Capacitance and Resistance; Equilibrium and Reversal Potentials; The Membrane Current

  3. Single-Compartment Models

  Integrate-and-Fire Models; Spike-Rate Adaptation and Refractoriness

  4. Voltage-Dependent Conductances

  Persistent Conductances; Transient Conductances; Hyperpolarization-Activated Conductances

  5. The Hodgkin-Huxley Model

  6. Modeling Channels

  7. Synaptic Conductances

  The Postsynaptic Conductance; Release Probability and Short-Term Plasticity

  8. Synapses on Integrate-and-Fire Neurons

  Regular and Irregular Firing Modes
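The integrate-and-fire model in this chapter is easy to simulate directly with Euler integration of tau_m dV/dt = E_L - V + R_m I_e plus a spike-and-reset rule. The parameter values below are illustrative choices in the range typical for such models:

```python
import numpy as np

# Leaky integrate-and-fire neuron driven by a constant current.
tau_m = 0.010      # membrane time constant (10 ms)
E_L = -0.065       # resting / leak potential (-65 mV)
V_th = -0.050      # spike threshold (-50 mV)
V_reset = -0.065   # reset potential after a spike
R_m_I = 0.020      # constant drive, R_m * I_e = 20 mV (suprathreshold)

dt = 0.0001        # 0.1 ms Euler step
T = 1.0            # simulate 1 second
V = E_L
spike_count = 0
for _ in range(int(T / dt)):
    V += dt / tau_m * (E_L - V + R_m_I)   # Euler update of membrane voltage
    if V >= V_th:                         # threshold crossing: emit a spike
        V = V_reset
        spike_count += 1
```

With these numbers the analytic interspike interval, tau_m * ln(R_m I_e / (R_m I_e - (V_th - E_L))), is about 13.9 ms, i.e. roughly 72 spikes per second; spike-rate adaptation and refractoriness would lengthen it.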

  Chapter 5 - Model Neurons II: Conductances and Morphology

  1. Levels of Neuron Modeling

  2. Conductance-Based Models

  The Connor-Stevens Model; Postinhibitory Rebound and Bursting

  3. The Cable Equation

  Linear Cable Theory (An Infinite Cable; An Isolated Branching Node); The Rall Model; The Morphoelectrotonic Transform

  4. Multi-Compartment Models

  Action Potential Propagation Along an Unmyelinated Axon; Propagation Along a Myelinated Axon
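For the linear cable theory above, the steady-state voltage on an infinite passive cable with current injected at x = 0 decays as v(x) = v(0) exp(-|x|/lambda), where lambda is the electrotonic length constant. A quick numerical check of this, using a finite-difference discretization of lambda^2 v'' = v on a long cable (the grid and units are illustrative):

```python
import numpy as np

# Steady-state passive cable: solve lambda^2 v'' = v on x in [-10, 10]
# (dimensionless units, lambda = 1) with v clamped to 1 at the injection
# site x = 0 and to 0 at the distant ends, then compare with exp(-|x|).
lam = 1.0
dx = 0.01
n = 2001                       # grid from -10*lam to +10*lam
x = (np.arange(n) - n // 2) * dx
c = n // 2                     # index of the injection site x = 0

A = np.zeros((n, n))
b = np.zeros(n)
for i in range(n):
    if i == c or i == 0 or i == n - 1:
        A[i, i] = 1.0                       # boundary / clamp rows
        b[i] = 1.0 if i == c else 0.0
    else:
        # Interior: (lam^2/dx^2)(v[i-1] - 2 v[i] + v[i+1]) - v[i] = 0
        A[i, i - 1] = A[i, i + 1] = lam**2 / dx**2
        A[i, i] = -2.0 * lam**2 / dx**2 - 1.0

v = np.linalg.solve(A, b)
analytic = np.exp(-np.abs(x) / lam)
```

The same finite-difference machinery, with per-segment radii and coupling conductances, is what multi-compartment simulators build on for branching morphologies like the Rall model.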

  Chapter 6 - Network Models

  1. Firing-Rate Models

  Feedforward and Recurrent Networks; Continuously Labelled Networks

  2. Feedforward Networks

  Neural Coordinate Transformations

  3. Recurrent Networks

  Linear Recurrent Networks (Selective Amplification; Input Integration; Continuous Linear Recurrent Networks);

  Nonlinear Recurrent Networks (Nonlinear Amplification; A Recurrent Model of Simple Cells in Primary Visual Cortex; A Recurrent Model of Complex Cells in Primary Visual Cortex; Winner-Take-All Input Selection; Gain Modulation; Sustained Activity; Maximum Likelihood and Network Recoding)

  4. Network Stability

  Associative Memory

  5. Excitatory-Inhibitory Networks

  Homogeneous Excitatory and Inhibitory Populations (Phase-Plane Methods and Stability Analysis); The Olfactory Bulb; Oscillatory Amplification

  6. Stochastic Networks
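The linear recurrent firing-rate model in this chapter, tau dv/dt = -v + h + M v, is stable when the eigenvalues of the recurrent weight matrix M stay below 1, and its steady state satisfies v = (I - M)^{-1} h. A small sketch verifying this numerically; the random weight matrix, input, and time constants are illustrative:

```python
import numpy as np

# Linear recurrent firing-rate network: integrate the dynamics to steady
# state and compare with the closed-form fixed point v = (I - M)^{-1} h.
rng = np.random.default_rng(2)
n = 20
M = rng.standard_normal((n, n))
M *= 0.8 / np.max(np.abs(np.linalg.eigvals(M)))  # scale spectral radius to 0.8
h = rng.standard_normal(n)                       # constant feedforward input

tau, dt = 0.010, 0.0005
v = np.zeros(n)
for _ in range(2000):                 # integrate for 1 s (100 time constants)
    v += dt / tau * (-v + h + M @ v)  # Euler step of tau dv/dt = -v + h + M v

v_steady = np.linalg.solve(np.eye(n) - M, h)
```

As an eigenvalue of M approaches 1 the corresponding mode of (I - M)^{-1} blows up, which is the selective-amplification mechanism listed above; past 1, the linear network is unstable and a saturating nonlinearity is needed.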

  PART III - PLASTICITY AND LEARNING

  Chapter 7 - Plasticity and Learning

  1. Stability and Competition

  2. Synaptic Plasticity Rules

  The Basic Hebb Rule; The Covariance Rule; The BCM Rule; Synaptic Normalization (Subtractive Normalization; Multiplicative Normalization and the Oja Rule); Timing-Based Rules

  3. Unsupervised Learning

  Single Postsynaptic Neuron (Principal Component Projection; Hebbian Development and Ocular Dominance; Hebbian Development of Orientation Selectivity; Temporal Hebbian Rules and Trace Learning)

  Multiple Postsynaptic Neurons (Fixed Linear Recurrent Connections; Competitive Hebbian Learning; Feature-Based Models; Anti-Hebbian Modification; Timing-Based Plasticity and Prediction)

  4. Supervised Learning

  Supervised Hebbian Learning (Classification and the Perceptron; Function Approximation)

  5. Supervised Error-Correcting Rules

  The Perceptron Learning Rule

  6. The Delta Rule

  Contrastive Hebbian Learning
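The perceptron learning rule named above can be illustrated on a toy linearly separable classification task. The dataset, margin filter, learning rate, and epoch count below are illustrative choices:

```python
import numpy as np

# Perceptron learning rule on a separable 2-D task: labels are the sign of
# the coordinate sum, with points too close to the boundary removed so the
# data has a positive margin and the rule is guaranteed to converge.
rng = np.random.default_rng(5)
u = rng.standard_normal((400, 2))
u = u[np.abs(u.sum(axis=1)) > 0.3]            # enforce a margin
target = np.where(u.sum(axis=1) > 0, 1, -1)   # +/-1 class labels

w = np.zeros(2)
b = 0.0
eta = 0.1
for _ in range(50):                           # epochs over the dataset
    for x, t in zip(u, target):
        out = 1 if w @ x + b > 0 else -1      # thresholded output unit
        w += eta * (t - out) * x              # update only on mistakes
        b += eta * (t - out)

pred = np.where(u @ w + b > 0, 1, -1)
accuracy = np.mean(pred == target)
```

The update only fires on misclassified examples, and the perceptron convergence theorem bounds the total number of mistakes by (R/gamma)^2 for data with margin gamma and radius R, so with this margin the loop reaches perfect training accuracy well within 50 epochs. The delta rule replaces the hard threshold with a graded error signal.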

 


The above is the 2017 Beijing Normal University 755 Computational Neuroscience master's entrance examination syllabus, compiled for candidates by 聚创考研网. We hope it is helpful for your exam preparation!
