<!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
<head>
<meta charset="utf-8">
<title>Neural Network</title>
<meta name="description" content="">
<meta name="viewport" content="width=device-width">
<link rel="stylesheet" href="http://fonts.googleapis.com/css?family=Roboto+Slab:400,700,300,100">
<link rel="stylesheet" href="http://fonts.googleapis.com/css?family=Roboto:400,400italic,300italic,300,500,500italic,700,900">
<!--
Artcore Template
http://www.templatemo.com/preview/templatemo_423_artcore
-->
<link rel="stylesheet" href="css/bootstrap.css">
<link rel="stylesheet" href="css/font-awesome.css">
<link rel="stylesheet" href="css/animate.css">
<link rel="stylesheet" href="css/templatemo-misc.css">
<link rel="stylesheet" href="css/templatemo-style.css">
<script src="js/vendor/modernizr-2.6.1-respond-1.1.0.min.js"></script>
</head>
<body>
<!--[if lt IE 7]>
<p class="chromeframe">You are using an outdated browser. <a href="http://browsehappy.com/">Upgrade your browser today</a> or <a href="http://www.google.com/chromeframe/?redirect=true">install Google Chrome Frame</a> to better experience this site.</p>
<![endif]-->
<section id="pageloader">
<div class="loader-item fa fa-spin colored-border"></div>
</section> <!-- /#pageloader -->
<header class="site-header container-fluid">
<div class="top-header">
<div class="logo col-md-6 col-sm-6">
<h1><a href="index.html"><em>Sasi </em>Bonu</a></h1>
<span> </span>
</div> <!-- /.logo -->
<div class="social-top col-md-6 col-sm-6">
<ul>
<li><a href="https://github.com/sasibonu" class="fa fa-github"></a></li>
<li><a href="https://www.linkedin.com/in/sasi-bonu-98878116a/" class="fa fa-linkedin"></a></li>
</ul>
</div> <!-- /.social-top -->
</div> <!-- /.top-header -->
<div class="main-header">
<div class="row">
<div class="main-header-left col-md-3 col-sm-6 col-xs-8">
<a id="search-icon" class="btn-left fa fa-search" href="#search-overlay"></a>
<div id="search-overlay">
<a href="#search-overlay" class="close-search"><i class="fa fa-times-circle"></i></a>
<div class="search-form-holder">
<h2>Type keywords and hit enter</h2>
<form id="search-form" action="#">
<input type="search" name="s" placeholder="" autocomplete="off" />
</form>
</div>
</div><!-- #search-overlay -->
</div> <!-- /.main-header-left -->
<div class="menu-wrapper col-md-9 col-sm-6 col-xs-4">
<a href="#" class="toggle-menu visible-sm visible-xs"><i class="fa fa-bars"></i></a>
<ul class="sf-menu hidden-xs hidden-sm">
<li><a href="index.html">Introduction</a></li>
<li><a href="eda.html">EDA</a></li>
<li class="active"><a href="#">Projects</a>
<ul>
<li><a href="projects-2.html">Two Columns</a></li>
<li><a href="projects-3.html">Three Columns</a></li>
<li><a href="project-details.html">Project Single</a></li>
</ul>
</li>
<li><a href="conclusion.html">Conclusion</a></li>
<li><a href="contact.html">Contact</a></li>
</ul>
</div> <!-- /.menu-wrapper -->
</div> <!-- /.row -->
</div> <!-- /.main-header -->
<div id="responsive-menu">
<ul>
<li><a href="index.html">Introduction</a></li>
<li><a href="eda.html">EDA</a></li>
<li><a href="#">Projects</a>
<ul>
<li><a href="projects-2.html">Two Columns</a></li>
<li><a href="projects-3.html">Three Columns</a></li>
<li><a href="project-details.html">Project Single</a></li>
</ul>
</li>
<li><a href="conclusion.html">Conclusion</a></li>
<li><a href="contact.html">Contact</a></li>
</ul>
</div>
</header> <!-- /.site-header -->
<div class="content-wrapper">
<div class="inner-container container">
<div class="row">
<div class="section-header col-md-12">
<h2>Neural Network</h2>
<span> </span>
</div> <!-- /.section-header -->
</div> <!-- /.row -->
<div class="project-detail row">
<div class="project-infos col-md-12">
<div class="box-content">
<h2 class="project-title">Neural Network</h2>
<span class="project-subtitle">Supervised Learning</span>
<p>
Neural networks are a class of machine learning models inspired by the structure and function of the human brain. They consist of layers of interconnected nodes, or neurons.
Information flows from the input layer to the output layer through weighted connections, and each neuron produces an output by applying an activation function to its weighted inputs.
Inputs are multiplied by the weights and offset by the biases before being passed to the activation function.
During a process called training, the connection weights are adjusted so that the model learns from labeled data to reduce the discrepancy between the predicted outputs and the true labels.
As a result, neural networks can learn intricate patterns and relationships in data, enabling tasks such as pattern recognition, regression, and classification.
</p>
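As a minimal sketch of the forward pass described above, the snippet below builds a hypothetical 2-3-1 network (the layer sizes and random weights are illustrative, not taken from the project) using only numpy: each layer multiplies its inputs by weights, adds a bias, and applies a sigmoid activation.

```python
import numpy as np

def sigmoid(z):
    # Squash values into (0, 1) to introduce non-linearity
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical 2-3-1 network: 2 inputs, a hidden layer of 3 neurons, 1 output
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # hidden-layer weights and biases
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # output-layer weights and biases

def forward(x):
    # Each layer: multiply inputs by weights, add the bias,
    # then apply the activation function
    h = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

print(forward(np.array([0.5, -1.2])))  # a single value in (0, 1)
```

With untrained random weights the output is arbitrary; training (covered next) is what makes it meaningful.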
<div class="project-slider col-md-12">
<img src="images/projects/nn.png" alt="Slide 1">
<img src="images/projects/nn.png" alt="Slide 2">
</div> <!-- /.project-slider -->
<p>
Weights and biases are optimized through backpropagation. By computing the gradient of the loss function with respect to the network's weights and biases, backpropagation lets the model adjust those parameters to minimize the loss.
The gradients are calculated with the chain rule of calculus and propagated backwards through the network, layer by layer, from the output layer to the input layer.
This reveals how much each weight contributes to the total loss and how it should change to reduce that loss. In practice, backpropagation is paired with an optimization algorithm, such as gradient descent, which updates the network's weights in the direction that minimizes the loss function.
Repeating this process over many iterations, or epochs, drives the model toward a set of weights that yields the smallest loss on the training data.
In a neural network, activation functions are mathematical functions applied to each neuron's output. They give the network non-linearity, which enables it to capture intricate patterns in the data. Common examples include sigmoid, tanh, ReLU (Rectified Linear Unit), and softmax.
Each serves a distinct role, such as applying a non-linear transformation, handling a particular range of input values, or producing probabilities for multi-class classification.
</p>
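The gradient-descent loop above can be sketched on a toy problem. This illustrative example (the OR function and all hyperparameters are made up for the demonstration) trains a single sigmoid neuron; for a sigmoid output with binary cross-entropy loss, the chain rule collapses the gradient at the output to <code>p - y</code>, which is then propagated back to the weights and bias.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: the OR function (linearly separable, so one neuron suffices)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(1)
w, b = rng.normal(size=2), 0.0
lr = 1.0  # learning rate for gradient descent

for epoch in range(2000):
    p = sigmoid(X @ w + b)        # forward pass
    # Chain rule: for binary cross-entropy with a sigmoid output,
    # dL/dz simplifies to (p - y); propagate it back to w and b
    grad_z = (p - y) / len(X)
    w -= lr * (X.T @ grad_z)      # gradient descent update
    b -= lr * grad_z.sum()

print(np.round(sigmoid(X @ w + b)))  # → [0. 1. 1. 1.]
```

Each epoch nudges the parameters against the gradient, so the loss shrinks until the predictions match the labels.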
<strong>Long Short-Term Memory</strong>
<p>
Long Short-Term Memory (LSTM) is a recurrent neural network (RNN) architecture designed to capture long-term dependencies in sequential data and to mitigate the vanishing gradient problem that affects standard RNNs. It introduces a memory cell that can retain information over time, regulated by three gates: an input gate, an output gate, and a forget gate.
These gates control the flow of information into and out of the memory cell, allowing LSTM networks to selectively remember or forget information over long sequences. This makes them well suited to sequential tasks such as speech recognition, time series prediction, and natural language processing.
</p>
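One LSTM time step with the three gates can be written out directly in numpy as a sketch (the input and hidden sizes, random parameters, and sequence length below are all hypothetical, chosen only to show the gating mechanics):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, params):
    """One LSTM time step: three gates regulate the memory cell c."""
    Wf, Wi, Wo, Wc, bf, bi, bo, bc = params
    z = np.concatenate([h_prev, x])  # gates see the input and the previous hidden state
    f = sigmoid(Wf @ z + bf)         # forget gate: what to erase from memory
    i = sigmoid(Wi @ z + bi)         # input gate: what new information to store
    o = sigmoid(Wo @ z + bo)         # output gate: what to expose as output
    c = f * c_prev + i * np.tanh(Wc @ z + bc)  # updated memory cell
    h = o * np.tanh(c)               # new hidden state
    return h, c

# Hypothetical sizes: 3 input features, hidden state of 4
n_in, n_h = 3, 4
rng = np.random.default_rng(2)
params = ([rng.normal(size=(n_h, n_h + n_in)) for _ in range(4)]
          + [np.zeros(n_h) for _ in range(4)])
h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.normal(size=(5, n_in)):  # run a sequence of 5 time steps
    h, c = lstm_step(x, h, c, params)
print(h.shape)  # (4,)
```

Because the forget gate multiplies the old cell state rather than repeatedly squashing it, gradients can flow across many time steps, which is what lets LSTMs retain long-range context.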
<div class="project-slider col-md-12">
<img src="images/projects/lstm.png" alt="Slide 1">
<img src="images/projects/lstm.png" alt="Slide 2">
</div> <!-- /.project-slider -->
<p>
<strong>Data Preparation</strong>
Neural networks require labeled data: each data point must carry a label identifying its class or category. To evaluate how well the trained model generalizes to new data, the data is divided into training and testing sets.
</p>
<ul>
<li><strong>Training Set:</strong> Used to train the neural network; it makes up the bulk of the data.</li>
<li><strong>Testing Set:</strong> Used to assess how well the trained model performs. It must be kept separate from the training set to provide an unbiased evaluation of the model's accuracy.</li>
</ul>
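A minimal train/test split can be done by shuffling indices and holding out a fraction of the data (the dataset below is synthetic and the 80/20 ratio is just a common convention, not necessarily the split used in the project):

```python
import numpy as np

# Hypothetical labeled dataset: 100 samples, 4 features, binary labels
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 4))
y = (X[:, 0] > 0).astype(int)

# Shuffle, then hold out 20% as an untouched testing set
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
train_idx, test_idx = idx[:split], idx[split:]
X_train, y_train = X[train_idx], y[train_idx]
X_test, y_test = X[test_idx], y[test_idx]

print(len(X_train), len(X_test))  # 80 20
```

Shuffling before splitting matters: if the data is ordered by class, a naive head/tail split would leave some classes entirely out of the training set.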
<div class="project-slider col-md-12">
<img src="images/projects/data.png" alt="Slide 1">
<img src="images/projects/data.png" alt="Slide 2">
</div> <!-- /.project-slider -->
<p><a href="https://github.com/sasibonu/sasibonu.github.io/blob/main/SasiBonuA05.ipynb">Code Link</a></p>
<ul class="project-meta">
<li><i class="fa fa-calendar-o"></i>02 May 2024</li>
</ul>
</div> <!-- /.box-content -->
</div> <!-- /.project-infos -->
</div> <!-- /.project-detail -->
</div> <!-- /.inner-container -->
</div> <!-- /.content-wrapper -->
<script src="js/vendor/jquery-1.11.0.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.11.0.min.js"><\/script>')</script>
<script src="js/plugins.js"></script>
<script src="js/main.js"></script>
<!-- Preloader -->
<script type="text/javascript">
//<![CDATA[
$(window).load(function() { // makes sure the whole site is loaded
$('.loader-item').fadeOut(); // will first fade out the loading animation
$('#pageloader').delay(350).fadeOut('slow'); // will fade out the white DIV that covers the website.
$('body').delay(350).css({'overflow-y':'visible'});
})
//]]>
</script>
</body>
</html>