Weekly | Instead of just obsessing over AI, why not get hands-on with these open source platforms?


Since the founding of OpenAI, companies such as Google, Facebook, Microsoft, and Twitter have, somewhat "half-heartedly", launched open source platforms to attract researchers from academia and industry to share their work. Whether these giants' public sharing of software and hardware designs is a deliberate move to accelerate the progress of the AI industry as a whole or a step forced on them by competitors, it is good news for anyone enamored with AI.

When you don't know where to start with AI, these open source platforms practically jump out at you.

Hands-on experience with Google's TensorFlow

Why did you initially choose TensorFlow as your platform?

At first we were unsure which deep learning platform to choose, and TensorFlow was not yet available. The main considerations at the time were the maturity of the platform, the programming languages supported, GPU support and efficiency, how easy it was to build neural networks and to get started, the platform's subsequent development and ecosystem, runtime efficiency, and other factors. Although we collected some reviews, it was not easy to weigh so many factors against each other, and trying each platform individually was not practical. Shortly afterwards, Google open sourced TensorFlow, and we picked it up without hesitation.

For one thing, TensorFlow has all the features we were looking for (C++/Python support, GPU acceleration, and so on). More importantly, we believed that a platform launched by Google would quickly be adopted by the community, form a corresponding development ecosystem, and see active ongoing development. Subsequent events confirmed our expectations. The table below compares several popular platforms, with data drawn from a paper published on arXiv in February of this year.
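Since the team cites the C++/Python APIs and GPU support as deciding factors, here is a hedged illustration of how little code those features require. This snippet is not from the interview and uses the current TensorFlow 2 API rather than the 2016 release the team adopted; the layer sizes and model are arbitrary placeholders.

```python
# Minimal sketch: check GPU visibility and build a tiny network with the Python API.
import tensorflow as tf

# An empty list here means TensorFlow will fall back to CPU execution.
print("GPUs visible to TensorFlow:", tf.config.list_physical_devices("GPU"))

# A toy two-layer model, just to show how compact a definition can be.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(16,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```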

MIT's newest programming language, Milk, accelerates parallel computing in the era of big data

This week MIT released a new programming language, Milk, which processes big data up to four times faster than existing languages.

This week at the International Conference on Parallel Architectures and Compilation Techniques, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) unveiled a new programming language, Milk, that allows application developers to manage memory more efficiently in problems dealing with discrete data points in large data sets.

In tests on several general-purpose algorithms, programs written in Milk ran up to four times faster than those written in existing programming languages, and the researchers believe further work will yield even better results.

Saman Amarasinghe, a professor of electrical engineering and computer science, says that large data sets strain existing memory management techniques not only because of their enormous size, but more because they are sparse; that is, the size of a problem's solution does not necessarily grow in proportion to the size of the problem.
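To make the sparsity point concrete, here is a rough conceptual sketch in plain Python/NumPy. It is not Milk code, and the array sizes are made up; it only illustrates the scattered, indirect accesses Milk targets and the general idea of grouping accesses by memory location before doing the work.

```python
# Conceptual sketch only: indirect ("gather") access over a large array.
import numpy as np

rng = np.random.default_rng(0)
values = rng.random(10_000_000)                         # large in-memory array
indices = rng.integers(0, values.size, size=1_000_000)  # sparse, scattered lookups

# Naive pattern: each values[indices[i]] may land on a different cache line.
naive_sum = values[indices].sum()

# Milk's idea, loosely: collect the requested addresses first, then batch
# accesses that fall close together in memory (approximated here by sorting).
batched_sum = values[np.sort(indices)].sum()

assert np.isclose(naive_sum, batched_sum)  # same result, friendlier access order
```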

OpenAI's third installment teaches you how to build infrastructure for deep learning research

In this article, OpenAI research engineers Vicki Cheung, Jonas Schneider, Ilya Sutskever, and Greg Brockman share the infrastructure (software, hardware, configuration, and compilation) needed for deep learning research, and give an example of how to automatically scale network models using the open source kubernetes-ec2-autoscaler. It should help deep learning enthusiasts build infrastructure of their own.

Deep learning is an empirical science, and the infrastructure of a research team will have a significant impact on future research efforts. Fortunately, today's open source ecosystem can equip anyone with the ability to build a more complete deep learning infrastructure.

In this post, we give an overview of how deep learning research is typically conducted, describe the infrastructure choices we made to support it, and open source kubernetes-ec2-autoscaler, a batch-optimized scaling manager for Kubernetes. We hope this article helps you build your own deep learning infrastructure.
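As a loose illustration of the batch-style workloads such a scaling manager handles, one might submit a single training run as a Kubernetes Job using the official Python client. This is not code from the OpenAI post; the image name, command, and namespace are placeholders, and kubernetes-ec2-autoscaler's role would be to add or remove EC2 nodes as Jobs like this queue up or finish.

```python
# Hedged sketch: submit one training experiment as a Kubernetes batch Job.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig credentials

job = client.V1Job(
    metadata=client.V1ObjectMeta(name="train-experiment-001"),
    spec=client.V1JobSpec(
        backoff_limit=0,  # do not automatically retry a failed experiment
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[
                    client.V1Container(
                        name="trainer",
                        image="example.com/deep-learning:latest",  # placeholder image
                        command=["python", "train.py", "--epochs", "10"],
                    )
                ],
            )
        ),
    ),
)

client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```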

Twitter open sources torch-twrl, a Lua/Torch-based reinforcement learning framework

Torch has been around for a decade, but it really took off after Facebook open sourced a number of Torch deep learning modules and extensions last year. Another distinctive feature of Torch is its use of the less common programming language Lua (originally popular for developing video games). And now Twitter has open sourced torch-twrl, its own Lua/Torch-based reinforcement learning framework.

The goal of a reinforcement learning algorithm (the agent) is to learn to perform complex, novel tasks by interacting with the task (the environment). Developing effective algorithms requires fast iteration and testing, and torch-twrl was open sourced with exactly that in mind.

Drawing on other reinforcement learning frameworks, torch-twrl aims to provide:

- A reinforcement learning framework in Lua/Torch with minimal dependencies;
- Clear, modular code that facilitates rapid development;
- Seamless integration with OpenAI's reinforcement learning benchmarking framework, Gym (a minimal interaction loop is sketched below).
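For readers new to the agent-environment loop mentioned above, here is a hedged sketch in Python using the Gymnasium successor to OpenAI Gym, with a random policy standing in for a real agent; it is not torch-twrl's Lua code, only an illustration of the interaction pattern such frameworks build on.

```python
# Hedged sketch: the basic agent-environment interaction loop on CartPole.
import gymnasium as gym

env = gym.make("CartPole-v1")
observation, info = env.reset(seed=0)

total_reward = 0.0
for _ in range(200):
    action = env.action_space.sample()  # a real agent would choose actions from a learned policy
    observation, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    if terminated or truncated:
        observation, info = env.reset()

env.close()
print("Reward collected by the random policy:", total_reward)
```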

Can't get enough of the above? 15 open source AI software packages, counted one by one: which one is for you?

Artificial intelligence is one of the hottest areas of research right now. Large companies such as IBM, Google, Microsoft, Facebook, and Amazon have not only increased funding for their own research and development departments, but have also begun acquiring startups that have made inroads in machine learning, neural networks, and natural language and image processing. Given the current explosion of AI research, Stanford University professors recently stated in a report: "The role of AI software is becoming increasingly powerful, and AI software that has a strong impact on human society and the economy will be available by 2030."

The website Datamation recently rounded up 15 popular open source artificial intelligence software packages, which we have compiled and translated in full below.


