
Commit aafd40b5 authored by Benoît Bayol

initial commit


# Build script for the OpenNN library, the bundled TinyXML-2 parser,
# and the simple function regression example.
cmake_minimum_required(VERSION 2.6)

# OpenNN library, built from all sources in source/.
file(GLOB src source/*.cpp)
add_library(opennn ${src})

# TinyXML-2, built as a separate library.
file(GLOB tinyxml2_src tinyxml2/*.cpp)
add_library(tinyxml2 ${tinyxml2_src})

# Example executable, linked against both libraries.
add_executable(simple_function_regression examples/simple_function_regression/main.cpp)
target_link_libraries(simple_function_regression opennn)
target_link_libraries(simple_function_regression tinyxml2)
\section*{The unit testing development pattern}
Unit testing is the process of integrating tests into the source code and running those tests every time the code is built. In that way, the build process checks not only for syntax errors, but for semantic errors as well.
In that regard, unit testing is generally considered a development pattern in which the tests are written even before the actual code. If tests are written first, they:
\begin{itemize}
\item[-] Describe what the code is supposed to do in concrete, verifiable terms.
\item[-] Provide examples of code use rather than just academic descriptions.
\item[-] Provide a way to verify when the code is finished (when all the tests run correctly).
\end{itemize}
\section*{Related code}
There exist several frameworks for incorporating test cases into C++ code, such as CppUnit or CppTest.
However, for portability reasons, \texttt{OpenNN} comes with a simple unit testing utility class for handling automated tests.
Also, every class and method has an associated test class and test method.
\subsubsection*{The UnitTesting class in OpenNN}
\texttt{OpenNN} includes the \lstinline"UnitTesting" abstract class to provide some simple mechanisms to build test cases and test suites.
\subsubsection*{Constructor}
Unit testing is to be performed on classes and methods. Therefore the \lstinline"UnitTesting" class is abstract: it cannot be instantiated, and concrete test classes must be derived from it.
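As an illustration, a concrete test class might be declared as follows. This is a minimal sketch: the class name \lstinline"TestMockClass" and the tested condition are hypothetical, and only the \lstinline"run_test_case" and \lstinline"assert_true" methods described below are used.
\begin{lstlisting}
class TestMockClass : public UnitTesting
{
public:

   // All the testing methods go here (hypothetical example).

   void run_test_case(void)
   {
      unsigned int a = 0;

      assert_true(a == 0, "Mock test");
   }
};
\end{lstlisting}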
\subsubsection*{Members}
The \lstinline"UnitTesting" class has the following members:
\begin{itemize}
\item[-] The counted number of tests.
\item[-] The counted number of passed tests.
\item[-] The counted number of failed tests.
\item[-] The output message.
\end{itemize}
Those members can be accessed or modified using get and set methods, respectively.
\subsubsection*{Methods}
Derived classes must implement the pure virtual \lstinline"run_test_case" method, which includes all testing methods. The use of this method is as follows:
\begin{lstlisting}
TestMockClass tmc;
tmc.run_test_case();
\end{lstlisting}
The \lstinline"assert_true" and \lstinline"assert_false" methods test whether some condition is satisfied or not, respectively. If the result is the expected one, the counter of passed tests is increased by one; otherwise the counter of failed tests is increased by one,
\begin{lstlisting}
unsigned int a = 0;
unsigned int b = 0;
TestMockClass tmc;
tmc.assert_true(a == b, "Increase tests passed count");
tmc.assert_false(a == b, "Increase tests failed count");
\end{lstlisting}
Finally, the \lstinline"print_results" method prints the testing outcome,
\begin{lstlisting}
TestMockClass tmc;
tmc.run_test_case();
tmc.print_results();
\end{lstlisting}
\subsubsection*{The unit testing classes}
Every single class in \texttt{OpenNN} has an associated test class, and every single method of that class has an associated test method.
In addition, a test suite covering all the classes distributed with \texttt{OpenNN} can be found in the folder \lstinline"AllTests".
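As a sketch of how such a suite might be driven, using the hypothetical \lstinline"TestMockClass" from above:
\begin{lstlisting}
int main(void)
{
   TestMockClass tmc;

   tmc.run_test_case();
   tmc.print_results();

   return(0);
}
\end{lstlisting}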
% Surface plot of f(\zeta_1,\zeta_2) = \zeta_1^2 + \zeta_2^2
% over the square [-1,1] x [-1,1].
[x,y] = meshgrid(-1:0.1:1,-1:0.1:1);
surf(x,y,x.^2+y.^2)
xlabel('\zeta_{1}')
ylabel('\zeta_{2}')
zlabel('f(\zeta_{1},\zeta_{2})')
\texttt{OpenNN} provides a workaround for function optimization problems.
It also includes some benchmark problems in function optimization.
\index{global minimum condition}
\index{minimal argument}
\index{minimization}
\index{maximal argument}
\index{maximization}
\index{global minimum}
\index{local minimum}
\index{unimodal function}
\index{multimodal function}
\index{local minimum condition}
The variational problem is formulated in terms of finding a function
which is an extremal argument of some performance functional. On the
other hand, the function optimization problem is formulated in terms
of finding a vector which is an extremal argument of some performance
function.
While neural networks naturally lead to the solution of
variational problems, \texttt{OpenNN} provides a workaround for function
optimization problems by means of the independent parameters.
Function optimization refers to the study of problems in which the
aim is to minimize or maximize a real function. In this way, the
performance function defines the optimization problem itself.
The formulation of a function optimization problem requires:
\begin{itemize}
\item[-] A neural network.
\item[-] A performance functional.
\item[-] A training strategy.
\end{itemize}
\subsection*{Neural network}
\index{unconstrained function optimization problem}
\index{number of variables}
\index{domain, objective function}
\index{image, objective function}
The independent parameters of a neural network span a vector space that represents the possible solutions of a function optimization problem.
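In the notation used below, the independent parameters are collected in a vector
\begin{equation*}
\boldsymbol\zeta = \left( \zeta_{1}, \ldots, \zeta_{d} \right) \in \mathbb{R}^{d},
\end{equation*}
where $d$ is the number of independent parameters.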
\subsection*{Performance functional}
The function to be optimized is called the performance function. The
domain of the objective function for a function optimization problem
is the set of independent parameters, and the image of that function
is the set of real numbers. The number of variables in the objective function
is the number of independent parameters.
A function optimization problem can be specified by a set of
constraints, which are equalities or inequalities that the solution
must satisfy. Such constraints are expressed as functions.
Thus, the constrained function optimization problem can be formulated as finding a vector such that the constraint functions are zero and for which the performance function takes on a minimum value.
In other words, the constrained function optimization problem consists of finding an argument that satisfies all the constraints and for which the objective function is an extremum. The integer $l$ is known as the number of constraints in the function optimization problem.
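Writing the constraint functions as $c_{i}$ (a notation introduced here for illustration), a standard statement of this problem is
\begin{equation*}
\begin{aligned}
\underset{\boldsymbol\zeta \in \mathbb{R}^{d}}{\text{minimize}} \quad & f(\boldsymbol\zeta)\\
\text{subject to} \quad & c_{i}(\boldsymbol\zeta) = 0, \quad i = 1, \ldots, l.
\end{aligned}
\end{equation*}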
\subsection*{Training strategy}
The training strategy is the solving strategy
for the optimization problem.
If possible, the quasi-Newton method should be applied here.
If that fails, the evolutionary algorithm can be used.
This section describes a number of test functions for optimization. These functions are taken from the literature on both local and global optimization.
%\subsection*{The De Jong's function}
%One of the simplest test functions for optimization is the De Jong's
%function, which is an unconstrained and unimodal function. The De
%Jong's function optimization problem in $d$ variables can be stated
%as:
%The De Jong's function has got a unique minimal argument
%$\boldsymbol\zeta^{*} = (0, \ldots, 0 )$, which gives a minimum value
%$f(\boldsymbol\zeta^{*}) = 0$. Figure \ref{DeJongFunction} is a plot of
%that function in $2$ variables.
%\begin{figure}[h!]
%\begin{center}
%\includegraphics[width=0.75\textwidth]{function_optimization/de_jong_function}
%\caption{The De Jong's Function in $2$ variables.}\label{DeJongFunction}
%\end{center}
%\end{figure}
%The gradient vector for the De Jong's function is given by and the Hessian matrix by
\subsection*{The Rosenbrock's function}
The Rosenbrock's function, also known as the banana function, is an unconstrained and unimodal function. The optimum lies inside a long, narrow, parabolic-shaped flat valley. Convergence to that optimum is difficult, and hence this problem has been used repeatedly to assess the performance of optimization algorithms. The Rosenbrock's function optimization problem in $d$ variables can be stated as:
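\begin{equation*}
\underset{\boldsymbol\zeta \in \mathbb{R}^{d}}{\text{minimize}} \quad
f(\boldsymbol\zeta) = \sum_{i=1}^{d-1} \left[ 100 \left( \zeta_{i+1} - \zeta_{i}^{2} \right)^{2} + \left( 1 - \zeta_{i} \right)^{2} \right].
\end{equation*}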
\begin{figure}[h!]
\begin{center}
\includegraphics[width=0.75\textwidth]{function_optimization/rosenbrock_function.png}
\caption{The Rosenbrock's function in $2$
variables.}\label{RosenbrockFunction}
\end{center}
\end{figure}
The minimal argument of the Rosenbrock's function is found at $\boldsymbol\zeta^{*} = (1, \ldots, 1)$. The minimum value of that function is $f(\boldsymbol\zeta^{*}) = 0$. Figure \ref{RosenbrockFunction} is a plot of the Rosenbrock's function in $2$ variables.
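As a self-contained illustration, independent of the \texttt{OpenNN} classes, the Rosenbrock's function can be evaluated in C++ as follows (the function name \lstinline"rosenbrock" is ours):
\begin{lstlisting}
#include <vector>
#include <cstddef>

// Rosenbrock's function in d variables:
// f(z) = sum_{i=1}^{d-1} [100 (z_{i+1} - z_i^2)^2 + (1 - z_i)^2]

double rosenbrock(const std::vector<double>& z)
{
   double f = 0.0;

   for(std::size_t i = 0; i + 1 < z.size(); i++)
   {
      const double p = z[i+1] - z[i]*z[i];
      const double q = 1.0 - z[i];

      f += 100.0*p*p + q*q;
   }

   return(f);
}
\end{lstlisting}
Evaluating this function at $(1, \ldots, 1)$ returns $0$, in agreement with the minimum given above.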
\subsection*{The Rastrigin's function}
The Rastrigin's function is based on the De Jong's function with the addition of cosine modulation to produce many local minima. As a result, this function is highly multimodal. However, the locations of the minima are regularly distributed. The Rastrigin's function optimization problem in $d$ variables can be stated as:
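\begin{equation*}
\underset{\boldsymbol\zeta \in \mathbb{R}^{d}}{\text{minimize}} \quad
f(\boldsymbol\zeta) = 10 d + \sum_{i=1}^{d} \left[ \zeta_{i}^{2} - 10 \cos\left( 2 \pi \zeta_{i} \right) \right].
\end{equation*}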
\begin{figure}[h!]
\begin{center}
\includegraphics[width=0.75\textwidth]{function_optimization/rastrigin_function.png}
\caption{The Rastrigin's function in $2$
variables.}\label{RastriginFunction}