Next week we're in Bangkok, Thailand, as a platinum sponsor of the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024). Find the Amazon team at booth 18 to learn more about our research, workshops, and career opportunities in this area: https://lnkd.in/efdifzXG #ACL2024 #MachineLearning #NLP
Amazon Science
Research Services
Seattle, Washington · 363,364 followers
The latest news and research from Amazon’s science community. #AmazonScience
About us
Amazon Science gives you insight into the company’s approach to customer-obsessed scientific innovation. Amazon fundamentally believes that scientific innovation is essential to being the most customer-centric company in the world. It’s the company’s ability to have an impact at scale that allows us to attract some of the brightest minds in artificial intelligence and related fields. Our scientists continue to publish, teach, and engage with the academic community, in addition to utilizing our working-backwards method to enrich the way we live and work. Follow us on LinkedIn and visit our website for a deep dive into innovation at Amazon, and explore the many ways you can engage with our scientific community. #AmazonScience
- Website: https://www.amazon.science
- Industry: Research Services
- Company size: 10,001+ employees
- Headquarters: Seattle, Washington
- Founded: 2020
- Specialties: Artificial Intelligence, Machine Learning, Computer Vision, Cloud, Economics, Sustainability, AI, ML, Conversational AI, Natural Language Processing, NLP, Robotics, Security, Privacy, Information, Knowledge Management, Operations, Scientific Research, Search, Amazon, and Alexa
Updates
Amazon Science reposted this
A core tenet for #AWS has always been that, if it was possible to take care of scaling, resilience, availability, security, and cost management of our customers' infrastructure components, we should do so. Eventually, this became known as "serverless", which encompasses much more than executing code without the need for servers. Amazon SQS and S3 were the first AWS services launched, and they were "serverless" from day one. A good example of this philosophy in practice is Aurora Serverless, which provides on-demand auto-scaling for Amazon Aurora databases. It proactively scales up resources during peak periods to ensure predictable performance when you require it. When demand subsides, it seamlessly scales back down, reducing waste. And with granular, second-by-second billing, you can truly optimize for cost. Remember, the essence of frugal architecture is robust monitoring driving the ability to optimize costs. To dive deeper into how Aurora Serverless exemplifies this principle, I recommend reading the latest paper here: https://lnkd.in/e4uPK6wv #AWS #serverless #databases
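The cost logic described above can be illustrated with a toy simulation. This is not AWS or Aurora code; the demand trace, headroom factor, and per-unit-second price are made-up numbers, chosen only to show how capacity that follows demand plus per-second billing reduces waste compared with provisioning for peak.

```python
# Toy autoscaler sketch (illustrative only, not AWS code):
# capacity tracks demand with some headroom, and cost accrues
# per second for whatever capacity is provisioned.

def simulate_autoscaler(demand, headroom=1.25, price_per_unit_second=0.0001):
    """Scale capacity to demand * headroom each second; bill what is provisioned."""
    cost, capacities = 0.0, []
    for d in demand:
        capacity = d * headroom  # scale up for peaks, back down after
        capacities.append(capacity)
        cost += capacity * price_per_unit_second
    return capacities, cost

demand = [10, 50, 80, 20, 10]  # hypothetical load per second
caps, cost = simulate_autoscaler(demand)

# Provisioning statically for peak would bill peak capacity every second.
peak_cost = max(caps) * 0.0001 * len(demand)
print(cost < peak_cost)  # auto-scaling bills less than peak provisioning
```

The gap between `cost` and `peak_cost` is exactly the waste that robust monitoring plus fine-grained billing lets you recover.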
Amazon Science reposted this
We’ve taken a big step forward in Just Walk Out technology, making our checkout-free service more accurate and more easily scalable to new locations globally. Using a new multi-modal foundation model, Amazon is improving the accuracy of our JWO technology by analyzing data from cameras and sensors throughout the store simultaneously, instead of tracking which items shoppers pick up and put back as a linear sequence. The new multi-modal foundation model also reduces the rare need for model retraining in unfamiliar shopping scenarios, as the self-learning system continues to teach itself, all while protecting your privacy. Customers are really going to enjoy these advancements in our contactless checkout: https://lnkd.in/gRHZygYs
For all their success, LLMs have trouble following prescribed sequences of operations, whether operational workflows or API dependencies. At this year's North American Chapter of the Association for Computational Linguistics (NAACL), Amazon researchers showed how to address this using dependency graphs and constrained decoding. #ConversationalAI #LLMs #NAACL2024
Congrats to Amazon’s Middle Mile, Planning Research Optimization Science team on being named finalists for the INFORMS 2024 Revenue Management & Pricing Section Award for their work developing a cutting-edge ML-based dynamic pricing model for Amazon Freight. Leveraging a multi-stage machine learning algorithm, the team’s state-of-the-art system optimizes pricing in real time, adapting to complex data patterns and evolving customer needs. #INFORMS2024
If you're enjoying ICML 2024 as much as we are, check out Amazon's career opportunities in machine learning, where we have full-time roles in science teams across the world, part-time roles for academics who want to work on large-scale technical challenges while they continue to teach, and internships for students who want to gain real-world experience. Explore our jobs: https://lnkd.in/e2Z-7SbM #ICML2024 #ML
The fight against hallucination in retrieval-augmented generation models starts with a method for accurately assessing it. In a paper that Amazon scientists are presenting this week at #ICML2024, they introduce a pioneering methodology that employs an automated exam-generation process, enhanced by item response theory, to evaluate the factual accuracy of #RAG models on specific tasks. Their approach is not only robust and interpretable but also cost-efficient, strategically identifying model strengths and refining exams to optimize their evaluative utility. #LLMs #MachineLearning
Automated evaluation of RAG pipelines with exam generation
amazon.science
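To make the item-response-theory angle concrete, here is a minimal sketch using the one-parameter (Rasch) IRT model. The item difficulties and the latent "ability" value are invented for illustration; the point is only that Fisher information peaks where an item's difficulty is near the model's ability, which is the statistical basis for pruning uninformative exam questions.

```python
# Rasch-model sketch: rank exam items by how informative they are
# about a RAG pipeline's latent factual-accuracy "ability".
# All numbers below are hypothetical.

import math

def p_correct(ability: float, difficulty: float) -> float:
    """Rasch model: probability of a correct answer given ability and difficulty."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def item_information(ability: float, difficulty: float) -> float:
    """Fisher information of an item; peaks where difficulty is near ability."""
    p = p_correct(ability, difficulty)
    return p * (1.0 - p)

ability = 0.0  # assumed latent accuracy of the pipeline under test
difficulties = {"q1": -3.0, "q2": 0.2, "q3": 4.0}  # too easy, well matched, too hard

# Refine the exam: keep the items that carry the most information.
ranked = sorted(difficulties, key=lambda q: -item_information(ability, difficulties[q]))
print(ranked[0])  # q2: the item closest to the model's ability is most informative
```

Questions the model almost always gets right (q1) or almost always gets wrong (q3) tell you little; the refined exam keeps items like q2.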
Meet the recipients of the 2024 Amazon Robotics Day One Fellowships—a program established to support exceptionally talented students from diverse technical and cultural backgrounds who are pursuing master-of-science degrees. The fully funded fellowship provides recipients with the opportunity to participate in the Amazon Fulfillment Technologies & Robotics internship program and receive mentorship from experts in their chosen field. #Robotics #Internships #DEI
Amazon Robotics names 2024 Day One Fellowship Program recipients
amazon.science
LLM training documents are usually combined and then split into fixed-size chunks, but the resulting document truncation hurts performance. At the International Conference on Machine Learning (ICML), Amazon researchers adapt the idea of bin packing to combine documents so as to minimize truncation. Learn more about the method in our latest blog post and download the paper. #ICML2024 #LLMs #GenerativeAI
Improving LLM pretraining with better data organization
amazon.science
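The bin-packing idea can be sketched with a classic best-fit-decreasing heuristic. This is an illustrative stand-in, not the paper's algorithm: documents (here, just their token counts) are placed into fixed-capacity training sequences so that far fewer of them must be split or truncated than with naive concatenate-and-chunk.

```python
# Best-fit-decreasing sketch: pack documents into fixed-length
# training sequences to minimize truncation. Token counts are toy values.

def pack_documents(doc_lengths, capacity):
    """Pack token counts into bins of `capacity` using best-fit decreasing."""
    bins = []  # each bin: [remaining_capacity, [packed doc lengths]]
    for length in sorted(doc_lengths, reverse=True):
        length = min(length, capacity)  # oversized docs are still truncated
        # best fit: the tightest bin that still has room for this doc
        best = min((b for b in bins if b[0] >= length),
                   key=lambda b: b[0], default=None)
        if best is None:
            bins.append([capacity - length, [length]])
        else:
            best[0] -= length
            best[1].append(length)
    return [b[1] for b in bins]

docs = [700, 600, 400, 300, 200]  # token counts; sequence length = 1024
print(pack_documents(docs, 1024))  # [[700, 300], [600, 400], [200]]
```

No document here crosses a sequence boundary, whereas concatenating all 2,200 tokens and slicing every 1,024 would cut two documents mid-stream.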
This week we're in Vienna, Austria, as a platinum sponsor of the International Conference on Machine Learning (ICML 2024). Find the Amazon team at booth 115 to learn more about our ML research and career opportunities. #ICML2024 #MachineLearning #AmazonScience
Amazon Science at ICML 2024
amazon.science