March 8, 2005

Identity mapping in ANN

This evening, I had a class on Artificial Neural Networks. Prof. Haifeng Li introduced some nice practical applications of EBP (error back-propagation). The most exciting idea was identity mapping in ANNs and its applications.


The main idea of identity mapping is as follows:
Suppose you have 8 input nodes, 8 output nodes, and log2(8) = 3 hidden nodes. You train the network on data vectors such as (a1, a2, ..., a8), where each training pair has the form ((a1, a2, ..., a8), (a1, a2, ..., a8)), i.e. the target output is the input itself, as shown in Figure 1.

Figure 1. Identity Mapping
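
Here is a minimal sketch of such a network in Python/NumPy, trained with plain error back-propagation. The 8-3-8 layer sizes follow the example above; the sigmoid units, learning rate, number of epochs, and the one-hot training patterns are only my own illustrative choices, not something from the lecture.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class IdentityMappingNet:
    def __init__(self, n_in=8, n_hidden=3):
        # Small random weights for input->hidden and hidden->output.
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, n_in))
        self.b2 = np.zeros(n_in)

    def encode(self, x):
        return sigmoid(x @ self.W1 + self.b1)   # hidden-layer code

    def decode(self, h):
        return sigmoid(h @ self.W2 + self.b2)   # reconstruction

    def train(self, X, lr=1.0, epochs=10000):
        # The targets equal the inputs: ((a1..a8), (a1..a8)).
        for _ in range(epochs):
            h = self.encode(X)
            y = self.decode(h)
            err = y - X                          # output error
            # Back-propagate through the sigmoid units.
            d2 = err * y * (1 - y)
            d1 = (d2 @ self.W2.T) * h * (1 - h)
            self.W2 -= lr * h.T @ d2
            self.b2 -= lr * d2.sum(axis=0)
            self.W1 -= lr * X.T @ d1
            self.b1 -= lr * d1.sum(axis=0)

# The classic encoder problem: eight one-hot patterns, each mapped to itself.
X = np.eye(8)
net = IdentityMappingNet()
net.train(X)
# Reconstruction should approach the identity patterns
# (it may need more epochs depending on the random initialization).
print(np.round(net.decode(net.encode(X)), 2))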

After training, the input-to-hidden layer has learned to transform the input data into a compact hidden representation. So you can use this part of the network for applications such as data compression, and then use the hidden-to-output layer for decompression.
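
Continuing the sketch above (it assumes the trained IdentityMappingNet object net from that code): the input-to-hidden part acts as the compressor, producing a 3-value code, and the hidden-to-output part reconstructs the original 8 values.

x = X[2]                      # an original 8-dimensional pattern
code = net.encode(x)          # 3 hidden values: the "compressed" form
restored = net.decode(code)   # the output layer reconstructs the 8 values
print(np.round(code, 2), np.round(restored, 2))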

I was excited by such a network structure, since it suggests a very important application for us: feature selection. You can construct the full original feature set, then use the identity-mapping network and take the best transformed features from the hidden layer. It is a nice idea.
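
In the same spirit, and again continuing the sketch above, the hidden-layer activations can serve as a reduced set of transformed features; which downstream task they feed is left open here and is only illustrative.

features = net.encode(X)      # 8 samples x 3 transformed features
print(np.round(features, 2))  # these codes could replace the raw 8 inputs in a classifier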

I believe there is another useful application of the identity-mapping network: data encryption and decryption.

I could try it in my research!
