
LEfSe analysis of 16S data


One day my brother suddenly asked me, "Have you done the differential abundance analysis yet?" Differential analysis? Haven't we done that already? I had started with library(edgeR)... "Not that, LEfSe!" Fine, LEfSe it is. What is LEfSe? Another learning process, shared here.

First I went to the LEfSe website.

Wow, this site offers much more than LEfSe; it is like a Pinterest of NGS data analysis tools. But today we are focusing on LEfSe, so open it.

First select the import file. The page says: upload your tab-separated file containing relative abundances and class information (optionally with subclass and subject information). Reading this, the format of the input data is still not very clear.

What format does the data need to be in? I scrolled down and saw this on the following page.

First, all data are relative percent abundances.

First row: a set of labels (the class).

Second row: another set of labels (the subclass).

Third row: the sample IDs.

Fourth row: the relative abundance of total Bacteria.

Fifth row: the relative abundance of the phylum Actinobacteria.

Row 9: the relative abundance of Clostridiales.

At this point I think I understand the format of the input file:

The grouping and sample information occupy the first few rows, and the data section merges the relative abundances from all six taxonomic levels into a single table, with the abundances at the different levels stacked together and separated by tabs. This reminded me of a command from QIIME, so next I prepared the LEfSe data as follows.
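The layout described above can be sketched in a few lines of Python. This is a minimal, hypothetical example; the class names, sample IDs, taxa, and abundance values are all made up for illustration:

```python
# Assemble a minimal LEfSe input table: a class row, a subclass row,
# a sample-ID row, then one row per taxon with relative abundances.
# All names and numbers below are hypothetical.
rows = [
    ["class",    "healthy", "healthy", "disease", "disease"],
    ["subclass", "siteA",   "siteB",   "siteA",   "siteB"],
    ["sample",   "S1",      "S2",      "S3",      "S4"],
    ["Bacteria",                       "100.0", "100.0", "100.0", "100.0"],
    ["Bacteria|Actinobacteria",        "12.5",  "8.3",   "20.1",  "15.7"],
    ["Bacteria|Firmicutes|Clostridia", "30.2",  "25.9",  "18.4",  "22.0"],
]
with open("lefse_input.txt", "w") as fh:
    for row in rows:
        fh.write("\t".join(row) + "\n")
```

Note the `|` separator in the lineage names, which is how LEfSe example inputs express the taxonomic hierarchy within one row name.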

# This time I used a txt-format OTU table produced by usearch10 and converted it to a biom file:

biom convert -i otu_table.txt -o otu_table.biom --table-type="OTU table" --to-json

# Summarize the table, just to make sure there are no problems:

biom summarize-table -i otu_table.biom

# Summarize relative abundances at each taxonomic level:

summarize_taxa_through_plots.py -o taxa_summary -i otu_table.biom -m map_lxdjhg_ys.txt

I ran it and got an error. What was the reason? It turned out that the biom file had no taxonomy annotations.

# Add the taxonomy information as the last column of the OTU table, in a column named taxonomy
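Appending that taxonomy column can be done with a short script. This is a minimal sketch: the toy OTU table, its contents, and the OTU-to-taxonomy mapping are all hypothetical stand-ins for your real files:

```python
# A toy tab-separated OTU table (contents hypothetical).
with open("otu_table.txt", "w") as fh:
    fh.write("#OTU ID\tS1\tS2\n")
    fh.write("OTU_1\t10\t3\n")
    fh.write("OTU_2\t5\t7\n")

# Hypothetical OTU -> taxonomy mapping (e.g. parsed from your
# taxonomy assignment output).
tax = {"OTU_1": "k__Bacteria; p__Firmicutes",
       "OTU_2": "k__Bacteria; p__Actinobacteria"}

# Append a "taxonomy" column to every row; the header gets the
# column name so downstream tools can recognize it.
with open("otu_table.txt") as src, open("otu_table_tax.txt", "w") as dst:
    for line in src:
        fields = line.rstrip("\n").split("\t")
        if fields[0] == "#OTU ID":
            fields.append("taxonomy")
        else:
            fields.append(tax.get(fields[0], "Unassigned"))
        dst.write("\t".join(fields) + "\n")
```

The resulting txt can then be converted back to biom with `biom convert`, telling it to treat the extra column as observation metadata (`--process-obs-metadata taxonomy`).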

# Summarize relative abundances at each taxonomic level again:

summarize_taxa_through_plots.py -o taxa_summary -i otu_table_tax.biom -m map_lxdjhg_ys.txt

# It finished quickly.

Now organize the table, using Excel (my R skills are not up to the table wrangling; we laugh at ourselves). This process does waste some time: six tables in total, each cleaned step by step with find-and-replace to turn the unwanted characters into the style the analysis requires. It took maybe ten minutes to finally merge everything into a single txt file.
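The Excel find-and-replace work can also be scripted. Below is a minimal sketch that merges per-level abundance tables into one file; the folder name, file names, and table contents are hypothetical, so adjust the glob pattern to wherever your taxa summaries actually live:

```python
import glob
import os

# Toy per-level tables mimicking taxa-summary output (contents hypothetical).
os.makedirs("taxa_summary", exist_ok=True)
with open("taxa_summary/table_L2.txt", "w") as fh:
    fh.write("Taxon\tS1\tS2\n")
    fh.write("Bacteria;Firmicutes\t0.6\t0.4\n")
with open("taxa_summary/table_L3.txt", "w") as fh:
    fh.write("Taxon\tS1\tS2\n")
    fh.write("Bacteria;Firmicutes;Clostridia\t0.3\t0.2\n")

# Merge all levels into one file, keeping a single header row and
# replacing ';' with '|', the lineage separator LEfSe inputs use.
with open("lefse_merged.txt", "w") as out:
    header_done = False
    for path in sorted(glob.glob("taxa_summary/table_L*.txt")):
        with open(path) as fh:
            header = fh.readline()
            if not header_done:
                out.write(header)
                header_done = True
            for line in fh:
                out.write(line.replace(";", "|"))
```

The remaining cleanup (renaming the header row to your class/subclass labels) follows the input layout described earlier.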

Start uploading.

Select the local file, choose Auto_detect as the file type, and click Start to upload.

Upon success, the following files appear on the right side of the page.

Step A.

Step B.

Step C: begin plotting.

Step D.

Step E: select the differential species.

Step F: histograms for all differential species.

The final results are these.

Learning never ends and sharing never stops!

Postscript.

On this topic, someone has taken QIIME's biom file and annotation file and adapted them for quick LEfSe analysis via Python scripts, referenced at https://github.com/twbattaglia/koeken and https://pypi.python.org/pypi/pannenkoek/0.1.5. Both tools should work; unfortunately I could not install either one. They are based on QIIME 1, I have no other Linux system, so I gave up. If you manage to install them, please do leave a message. Of course, being a noob, I managed to break QIIME after a full day of trying and have since uninstalled it.


