User profiles for Itamar Friedman
Itamar Friedman, CEO at Qodo (fka CodiumAI). Verified email at codium.ai. Cited by 2049.
Asymmetric loss for multi-label classification
In a typical multi-label setting, a picture contains on average few positive labels, and many
negative ones. This positive-negative imbalance dominates the optimization process, and can …
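The snippet above describes a loss that counteracts positive-negative imbalance by treating the two label polarities asymmetrically. A minimal NumPy sketch of that idea, with illustrative focusing and margin values (the paper's actual defaults and formulation may differ):

```python
import numpy as np

def asymmetric_loss(p, y, gamma_pos=1.0, gamma_neg=4.0, margin=0.05, eps=1e-8):
    """Asymmetric multi-label loss sketch.

    p: predicted per-label probabilities, y: binary ground-truth labels.
    Negatives get a stronger focusing exponent (gamma_neg > gamma_pos) and a
    probability margin that zeroes out the contribution of easy negatives.
    """
    p = np.asarray(p, dtype=float)
    y = np.asarray(y, dtype=float)
    # Probability shifting: easy negatives (p < margin) contribute nothing.
    p_m = np.clip(p - margin, 0.0, 1.0)
    loss_pos = y * (1.0 - p) ** gamma_pos * np.log(p + eps)
    loss_neg = (1.0 - y) * p_m ** gamma_neg * np.log(1.0 - p_m + eps)
    return -np.sum(loss_pos + loss_neg)
```

With these settings, a negative label predicted at p=0.01 falls below the margin and is fully discarded, while a confidently wrong negative (high p) still dominates the loss, which is the imbalance-control behavior the snippet alludes to.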
Tresnet: High performance gpu-dedicated architecture
Many deep learning models, developed in recent years, reach higher ImageNet accuracy
than ResNet50, with fewer or comparable FLOPs count. While FLOPs are often seen as a …
Code generation with alphacodium: From prompt engineering to flow engineering
T Ridnik, D Kredo, I Friedman - arXiv preprint arXiv:2401.08500, 2024 - arxiv.org
Code generation problems differ from common natural language problems - they require
matching the exact syntax of the target language, identifying happy paths and edge cases, …
Graph embedded pose clustering for anomaly detection
We propose a new method for anomaly detection of human actions. Our method works directly
on human pose graphs that can be computed from an input video sequence. This makes …
Xnas: Neural architecture search with expert advice
This paper introduces a novel optimization method for differential neural architecture search,
based on the theory of prediction with expert advice. Its optimization criterion is well fitted …
Asap: Architecture search, anneal and prune
Automatic methods for Neural Architecture Search (NAS) have been shown to produce state-of-the-art
network models, yet, their main drawback is the computational complexity of the …
Multi-label classification with partial annotations using class-aware selective loss
Large-scale multi-label classification datasets are commonly, and perhaps inevitably,
partially annotated. That is, only a small subset of labels are annotated per sample. Different …
Semantic diversity learning for zero-shot multi-label classification
Training a neural network model for recognizing multiple labels associated with an image,
including identifying unseen labels, is challenging, especially for images that portray …
Knapsack pruning with inner distillation
Neural network pruning reduces the computational cost of an over-parameterized network
to improve its efficiency. Popular methods vary from $\ell_1$-norm sparsification to Neural …
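The title frames pruning as a knapsack problem: keep the substructures (e.g. channels) whose combined importance is maximal subject to a compute budget. A standalone 0/1-knapsack sketch of that selection step, with made-up importance scores and integer costs standing in for FLOPs (the paper's actual importance measure and distillation component are not shown here):

```python
def knapsack_select(importance, cost, budget):
    """0/1 knapsack: choose items (e.g. channels) maximizing total
    importance while total cost (e.g. FLOPs) stays within budget.

    importance: per-item scores; cost: per-item integer costs;
    budget: integer cost cap. Returns (best_score, kept_indices).
    """
    n = len(importance)
    # dp[b] = best achievable importance with budget b
    dp = [0.0] * (budget + 1)
    keep = [[False] * (budget + 1) for _ in range(n)]
    for i in range(n):
        # Iterate budgets downward so each item is used at most once.
        for b in range(budget, cost[i] - 1, -1):
            cand = dp[b - cost[i]] + importance[i]
            if cand > dp[b]:
                dp[b] = cand
                keep[i][b] = True
    # Backtrack to recover which items were kept.
    chosen, b = [], budget
    for i in range(n - 1, -1, -1):
        if keep[i][b]:
            chosen.append(i)
            b -= cost[i]
    return dp[budget], sorted(chosen)
```

For example, with importances [3, 4, 5, 6], costs [2, 3, 4, 5], and budget 5, the selection keeps items 0 and 1 (total importance 7) rather than the single highest-importance item.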