Topic detection is a process that analyzes the words in a collection of textual data to determine the topics in the collection, how they relate to each other, and how they change over time. One recent topic detection method is Separable Nonnegative Matrix Factorization (SNMF), which solves nonnegative matrix factorization directly under the separability assumption. The SNMF method consists of three stages: generating a word co-occurrence matrix, determining anchor words, and recovering the word-topic matrix. In this paper, we examine latent semantics-based methods for determining the anchor word of each topic. Our simulations show that both latent semantics-based methods achieve coherence scores comparable to the standard method while being more efficient in running time.
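The three SNMF stages mentioned above can be sketched as follows. This is a minimal illustrative implementation, not the paper's method: it assumes raw within-document co-occurrence counts, uses the Successive Projection Algorithm as a stand-in for anchor-word selection, and recovers the word-topic matrix by clipped least squares; the actual anchor-selection and recovery procedures studied in the paper may differ.

```python
import numpy as np

def cooccurrence(docs, vocab):
    # Stage 1: word co-occurrence counts accumulated per document.
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    Q = np.zeros((V, V))
    for doc in docs:
        counts = np.zeros(V)
        for w in doc:
            counts[idx[w]] += 1
        # outer product counts pairs; subtract the diagonal so a word
        # is not counted as co-occurring with itself
        Q += np.outer(counts, counts) - np.diag(counts)
    return Q / Q.sum()

def select_anchors(Q, k):
    # Stage 2: pick k anchor words with the Successive Projection
    # Algorithm on the row-normalized co-occurrence matrix.
    R = Q / Q.sum(axis=1, keepdims=True)
    anchors = []
    for _ in range(k):
        i = int(np.argmax((R * R).sum(axis=1)))  # row of largest norm
        anchors.append(i)
        u = R[i] / np.linalg.norm(R[i])
        R = R - R @ np.outer(u, u)  # project rows away from the anchor
    return anchors

def recover_topics(Q, anchors):
    # Stage 3: express each word's co-occurrence profile as a
    # combination of the anchor rows (least squares, clipped to be
    # nonnegative, then column-normalized into a word-topic matrix).
    Qn = Q / Q.sum(axis=1, keepdims=True)
    A = Qn[anchors]                                  # k x V anchor rows
    C, *_ = np.linalg.lstsq(A.T, Qn.T, rcond=None)   # k x V coefficients
    C = np.clip(C.T, 0, None)                        # V x k, nonnegative
    return C / np.maximum(C.sum(axis=0, keepdims=True), 1e-12)
```

In this sketch the anchor words are the rows of the co-occurrence matrix that remain extremal after repeated projection, which mirrors the separability assumption: every topic has at least one word that occurs (essentially) only under that topic.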