Accepted Papers by NTT Laboratories and NTT Research to Showcase Artificial Intelligence (AI) and Machine Learning (ML) Research Breakthroughs at Prestigious International Conference
SUNNYVALE, Calif.–(BUSINESS WIRE)–#TechforGood—NTT Laboratories and NTT Research, Inc., a division of NTT (TYO:9432), today announced that their researchers, including members of the Human Informatics (HI) Labs, Computer & Data Science (CD) Labs, Communication Science (CS) Labs and NTT Research’s Cryptography & Information Security (CIS) Lab, authored or co-authored seven papers that have been accepted for presentation at NeurIPS (Neural Information Processing Systems) 2022, one of the leading international conferences on AI and ML. Four additional papers from scientists in NTT Research’s CIS Lab and Physics & Informatics (PHI) Lab were accepted for related workshops. This year’s event runs from November 28th through December 9th, with the first week taking place in New Orleans and a virtual component during the second week.
The NeurIPS 2022 program committee, composed of more than 60 senior area chairs and hundreds of experts, accepted 2,672 of 10,411 submissions this year – an acceptance rate of only 25.6%. Details of the conference schedule and featured NTT papers are outlined below:
- Hideaki Kim and Taichi Asami of HI Labs, along with Hiroyuki Toda of Yokohama City University, presented their paper, titled “Fast Bayesian Estimation of Point Process Intensity as Function of Covariates,” on Tuesday, Nov. 29th, at 2pm (PST). In their paper, the researchers tackle the Bayesian estimation of point process intensity as a function of covariates and propose a novel augmentation of the Gaussian Cox process to derive a fast estimation algorithm that scales linearly with data size. They evaluate the algorithm on synthetic and real-world data, showing that it outperforms state-of-the-art methods in predictive accuracy.
- Daiki Chijiwa, Shinya Yamaguchi, Atsutoshi Kumagai and Yasutoshi Ida of CD Labs will present their paper, titled “Meta-ticket: Finding optimal subnetworks for few-shot learning within randomly initialized neural networks,” on Wednesday, Nov. 30th, at 2pm (PST). In their paper, they empirically show the existence of sparse deep neural network (DNN) structures that are less prone to overfit small datasets, and that these structures can be identified by meta-learning with weight pruning. They also show that the meta-learned sparse structures can be used effectively across various domains. As a result, their findings help mitigate the overfitting that arises when DNNs learn from small amounts of data.
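As a generic illustration of the idea of extracting a sparse subnetwork from a randomly initialized layer (a lottery-ticket-style magnitude mask, not the meta-learning procedure the paper actually proposes), a minimal sketch might look like this; the function name and sparsity level are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_mask(w, sparsity=0.8):
    """Binary mask keeping the largest-magnitude (1 - sparsity) fraction of weights."""
    k = int(w.size * (1 - sparsity))
    thresh = np.sort(np.abs(w).ravel())[::-1][k - 1]  # k-th largest magnitude
    return (np.abs(w) >= thresh).astype(w.dtype)

w = rng.normal(size=(64, 64))   # a randomly initialized layer
m = prune_mask(w, sparsity=0.8)
sparse_w = w * m                # the subnetwork: surviving weights stay at init
print(round(m.mean(), 2))       # fraction of weights kept, about 0.2
```

In the paper's setting, the mask itself is what gets meta-learned across few-shot tasks, while the underlying random weights remain untrained.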
- Masaaki Nishino, Kengo Nakamura and Norihito Yasuda of CS Labs will present their paper, titled “Generalization Analysis on Learning with a Concurrent Verifier,” on Wednesday, Nov. 30th, at 9am (PST). Their research proposes pairing a machine learning model with a verifier that guarantees the model’s predictions satisfy a given specification, and theoretically analyzes how the model’s generalization performance changes when the verifier is used.
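To illustrate the basic shape of the idea (a simplified sketch, not the paper's construction or its theoretical analysis), a verifier can be thought of as a wrapper that checks each prediction against the specification at inference time; all function names below are hypothetical:

```python
from typing import Callable, Sequence

def verified_predict(model: Callable[[Sequence[float]], int],
                     spec: Callable[[Sequence[float], int], bool],
                     fallback: Callable[[Sequence[float]], int],
                     x: Sequence[float]) -> int:
    """Return model(x) if it satisfies the specification, else a spec-satisfying fallback."""
    y = model(x)
    return y if spec(x, y) else fallback(x)

# Toy specification: predicted labels must lie in the range [0, 10).
model = lambda x: int(sum(x))    # stand-in "model"
spec = lambda x, y: 0 <= y < 10
fallback = lambda x: 0           # trivially satisfies the spec

print(verified_predict(model, spec, fallback, [3, 4]))  # 7 satisfies the spec
print(verified_predict(model, spec, fallback, [8, 9]))  # 17 violates it, so 0
```

The paper's contribution is the generalization analysis of models used with such a verifier, which this sketch does not attempt to reproduce.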
- Sanjam Garg of NTT Research, in collaboration with co-authors Somesh Jha, Saeed Mahloujifar, Mohammad Mahmoody and Mingyuan Wang, will present their paper, titled “Overparameterization from Computational Constraints,” on Wednesday, Nov. 30th, at 2pm (PST). Their paper asks whether the need for large, overparameterized models is due in part to the computational limitations of the learner, and whether the situation is exacerbated for robust (efficient) learning. The authors show that efficient learning could provably need more parameters than inefficient learning.
- Tomoharu Iwata of CS Labs and Atsutoshi Kumagai of CD Labs will present their paper, titled “Sharing Knowledge for Meta-learning with Feature Descriptions,” on Wednesday, Nov. 30th, at 2pm (PST). Their paper proposes a meta-learning method that learns how to learn models using data with feature descriptions from various tasks. The proposed method achieves high predictive performance with little training data on unseen tasks.
- Atsutoshi Kumagai and Yasutoshi Ida of CD Labs, along with Tomoharu Iwata of CS Labs, will present their paper, titled “Few-shot Learning for Feature Selection with Hilbert-Schmidt Independence Criterion,” on Thursday, Dec. 1st, at 9am (PST). Their paper proposes a few-shot learning method for supervised feature selection. Their method improves feature selection performance on small datasets by using information from related datasets.
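The Hilbert-Schmidt Independence Criterion (HSIC) at the core of this work measures statistical dependence between two variables via kernel Gram matrices. As background only (a standard biased HSIC estimator with Gaussian kernels, not the paper's few-shot method; the kernel bandwidth and sample sizes are arbitrary choices for the example):

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel Gram matrix
    sq = np.sum(x**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * x @ x.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2
    n = x.shape[0]
    K, L = rbf_gram(x, sigma), rbf_gram(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y_dep = x + 0.1 * rng.normal(size=(200, 1))  # strongly dependent on x
y_ind = rng.normal(size=(200, 1))            # independent of x
print(hsic(x, y_dep) > hsic(x, y_ind))       # HSIC detects the dependence
```

In HSIC-based feature selection, features are scored by their dependence on the target; the paper's contribution is learning to do this well from only a few labeled examples per task.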
- Yusuke Tanaka, Tomoharu Iwata and Yasuhiro Fujiwara of CS Labs will present their paper, titled “Symplectic Spectrum Gaussian Processes: Learning Hamiltonians from Noisy and Sparse Data,” on Thursday, Dec. 1st, at 2pm (PST). Their paper proposes a Gaussian process model that incorporates the theory of Hamiltonian mechanics. Experiments on several physical systems show that the proposed model can accurately predict dynamics that follow energy conservation or dissipation laws from noisy and sparse data.
In addition, four workshop papers authored or co-authored by NTT Research, Inc. scientists have been accepted for presentation at the conference. The papers include:
“What shapes the loss landscape of self-supervised learning?”
by Liu Ziyin, Ekdeep Singh Lubana, Masahito Ueda and Hidenori Tanaka
(NeurIPS 2022 Workshop: Self-Supervised Learning – Theory and Practice)
“Geometric Considerations for Normalization Layers in Equivariant Neural Networks”
by Max Aalto, Ekdeep S. Lubana and Hidenori Tanaka
(NeurIPS 2022 Workshop: AI for Accelerated Materials Design)
“Mechanistic Lens on Mode Connectivity”
by Ekdeep Singh Lubana, Eric J. Bigelow, Robert Dick, David Krueger and Hidenori Tanaka
(NeurIPS 2022 Workshop: Distribution Shifts Connecting Methods and Applications)
“Training physical networks like neural networks: deep physical neural networks”
by Logan G. Wright, Tatsuhiro Onodera, Martin M. Stein, Tianyu Wang, Darren T. Schachter, Zoey Hu and Peter L. McMahon
(NeurIPS 2022 Workshop: Machine Learning and the Physical Sciences)
NTT’s R&D is centered around the IOWN (Innovative Optical and Wireless Network) concept, a future communication infrastructure that aims to optimize the individual and the whole through sustainable, environmentally friendly growth and tolerance for diversity. Together with NTT Group operating companies and people in various industries, we will work to solve various social issues and realize a smart world where people can enjoy the benefits of unobtrusive technology. We will continue to research and develop technologies that will transform the world and uphold our value of diversity.
About NTT Research
NTT Research opened its offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research facilities in Sunnyvale: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuroscience and photonics; 2) cryptography and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.
NTT is a global technology and business solutions provider helping clients accelerate growth and innovate digital business models. We provide digital business consulting, technology and managed services for cybersecurity, applications, workplace, cloud, data center and networks – all supported by our deep industry expertise and innovation. As a top-five global IT services provider, our diverse teams deliver services in 190+ countries and regions. We serve 85% of the Fortune Global 100 companies and thousands of other clients and communities. With a 120-year heritage of service and social responsibility, we advocate and act for our clients and a sustainable world. For more information on NTT, visit https://www.global.ntt/.
NTT and the NTT logo are registered trademarks or trademarks of NIPPON TELEGRAPH AND TELEPHONE CORPORATION and/or its affiliates. All other referenced product names are trademarks of their respective owners. © 2022 NIPPON TELEGRAPH AND TELEPHONE CORPORATION