diff --git a/README.md b/README.md
index 1a120e10d07102d24b52731ffd42e31c0ee81676..bda9c386ffc7c3c19e43206d9a1d2a5413b9c9d7 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
# Deep Learning Clustering
-# No one believes this truth!
+# Someone doesn't believe this truth, but who cares!
In this report, we try to optimize an idea originally presented in "Learning Deep Representations for Graph Clustering" by F. Tian, B. Gao, Q. Cui, E. Chen, and T. Liu. The idea is to embed the similarity graph with a deep autoencoder trained under a sparsity penalty, and then run the K-means algorithm on the resulting embedding to obtain the clustering. Although our model is based on the original idea, the graph similarity measure, the loss function, and the training method are all different. We also compare our results with two previous results from recent papers (F. Tian, B. Gao, Q. Cui, E. Chen, T. Liu, 2014; S. Cao, W. Lu, Q. Xu, 2016) on the same datasets.
Below you will see an autoencoder embedding of 3NG (3 groups of the 20-newsgroups dataset) data in two dimensions:
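The pipeline described above (embed the similarity graph with a sparse autoencoder, then run K-means on the embedding) can be sketched as follows. This is a minimal illustration, not the report's actual model: the toy block-structured similarity matrix, the single-hidden-layer network, and all hyperparameters (`d`, `lr`, `lam`) are assumptions chosen for brevity.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy "similarity graph": row i of S is node i's similarity profile.
# Two planted blocks so the clustering is recoverable (an assumption for the demo).
n, k = 60, 2
S = np.full((n, n), 0.1)
S[:30, :30] = 0.9
S[30:, 30:] = 0.9
S += 0.05 * rng.standard_normal((n, n))

# One-hidden-layer autoencoder: sigmoid encoder, linear decoder,
# L1 sparsity penalty on the hidden code H.
d = 2                                  # embedding dimension
W1 = 0.1 * rng.standard_normal((n, d))
b1 = np.zeros(d)
W2 = 0.1 * rng.standard_normal((d, n))
b2 = np.zeros(n)
lr, lam = 0.05, 1e-3                   # learning rate, sparsity weight

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(500):
    H = sigmoid(S @ W1 + b1)           # encoder: n x d codes
    R = H @ W2 + b2                    # decoder: reconstruction of S
    err = R - S                        # gradient of 0.5*||R - S||^2 w.r.t. R
    # Backprop through decoder, then encoder; add L1 sparsity gradient on H.
    dW2 = H.T @ err / n
    db2 = err.mean(axis=0)
    dH = err @ W2.T + lam * np.sign(H)
    dZ = dH * H * (1.0 - H)            # sigmoid derivative
    dW1 = S.T @ dZ / n
    db1 = dZ.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Final step of the pipeline: K-means on the learned embedding.
embedding = sigmoid(S @ W1 + b1)
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)
print(labels)
```

With the planted two-block similarity matrix, the two blocks end up in separate K-means clusters; on real data the embedding would come from text-derived similarities such as the 20-newsgroups graphs compared in the report.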