Furious AI Researcher Creates Site Shaming Non-Reproducible Machine Learning Papers

The Next Web tells the story of an AI researcher who discovered that the results of a machine learning research paper couldn't be reproduced. They then heard similar stories from Reddit's Machine Learning forum: "Easier to compile a list of reproducible ones...," one user responded. "Probably 50%-75% of all papers are unreproducible. It's sad, but it's true," another user wrote. "Think about it, most papers are 'optimized' to get into a conference. More often than not the authors know that a paper they're trying to get into a conference isn't very good! So they don't have to worry about reproducibility because nobody will try to reproduce them."

A few other users posted links to machine learning papers they had failed to implement and voiced their frustration that code implementation is not a requirement at ML conferences. The next day, ContributionSecure14 created "Papers Without Code," a website that aims to build a centralized list of machine learning papers that cannot be implemented... Papers Without Code includes a submission page, where researchers can submit unreproducible machine learning papers along with the details of their efforts, such as how much time they spent trying to reproduce the results... If the authors do not reply in a timely fashion, the paper is added to the list of unreproducible machine learning papers.

Read more of this story at Slashdot.



