Codesota · Models · UniXcoder · 3 results · 3 benchmarks
Model card

UniXcoder

Microsoft · open-source · Unknown params · Transformer encoder-decoder

Unified cross-modal pre-training for code. ACL 2022.

§ 01 · Benchmarks

Every benchmark UniXcoder has a recorded score for.

| #  | Benchmark                  | Area · Task                         | Metric          | Value | Rank  | Date       | Source   |
|----|----------------------------|-------------------------------------|-----------------|-------|-------|------------|----------|
| 01 | Bugs2Fix                   | Computer Code · Bug Detection       | accuracy        | 66.4% | #5/6  | 2022-03-07 | source ↗ |
| 02 | CodeSearchNet              | Computer Code · Code Summarization  | bleu-4          | 19.1% | #6/7  | 2022-03-07 | source ↗ |
| 03 | codesearchnet---javascript | Computer Code · Code Summarization  | smoothed-bleu-4 | 15.5% | #7/14 | 2024-07-01 | source ↗ |
The Rank column shows this model's position among all models scored on the same benchmark and metric (total competitors after the slash). #1 marks the current SOTA. Rows are sorted by rank, then by newest result.
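The bleu-4 and smoothed-bleu-4 figures above are n-gram overlap metrics for generated code summaries. As a rough illustration, here is a minimal pure-Python sketch of one common smoothed BLEU-4 variant (add-one smoothing on the n-gram precisions); the exact smoothing used by these leaderboards may differ.

```python
import math
from collections import Counter

def smoothed_bleu4(reference, candidate):
    """Sentence-level BLEU-4 with add-one smoothing on n-gram precisions.

    One simple smoothing variant; leaderboards may use a different one.
    """
    precisions = []
    for n in range(1, 5):
        ref_ngrams = Counter(tuple(reference[i:i + n])
                             for i in range(len(reference) - n + 1))
        cand_ngrams = Counter(tuple(candidate[i:i + n])
                              for i in range(len(candidate) - n + 1))
        # Clipped n-gram matches, then add-one smoothing.
        overlap = sum(min(c, ref_ngrams[g]) for g, c in cand_ngrams.items())
        total = max(sum(cand_ngrams.values()), 1)
        precisions.append((overlap + 1) / (total + 1))
    # Brevity penalty discourages overly short candidates.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / 4)

print(round(smoothed_bleu4("add two numbers and return".split(),
                           "adds two numbers then return".split()), 3))
```

An identical candidate and reference score 1.0; partial overlap lands strictly between 0 and 1.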
§ 02 · Strengths by area

Where UniXcoder actually performs.

Computer Code · 3 benchmarks · avg rank #6.0
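The "avg rank" figure is just the arithmetic mean over the Rank column of § 01, grouped by area. A minimal sketch, using illustrative hand-copied rows in the shape of that table (the field names are mine, not Codesota's):

```python
from collections import defaultdict
from statistics import mean

# Illustrative rows mirroring the § 01 table; the ranks match the Rank
# column there, but this is hand-copied sample data, not a live API response.
rows = [
    {"area": "Computer Code", "benchmark": "Bugs2Fix", "rank": 5},
    {"area": "Computer Code", "benchmark": "CodeSearchNet", "rank": 6},
    {"area": "Computer Code", "benchmark": "codesearchnet---javascript", "rank": 7},
]

by_area = defaultdict(list)
for row in rows:
    by_area[row["area"]].append(row["rank"])

for area, ranks in sorted(by_area.items()):
    print(f"{area} · {len(ranks)} benchmarks · avg rank #{mean(ranks):.1f}")
    # → Computer Code · 3 benchmarks · avg rank #6.0
```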
§ 03 · Papers

2 papers with results for UniXcoder.

  1. 2024-07-01 · 1 result

     ESALE: Enhancing Code-Summary Alignment Learning for Source Code Summarization

  2. 2022-03-07 · Computer Code · 2 results

     UniXcoder: Unified Cross-Modal Pre-Training for Code Representation

§ 04 · Related models

Other Microsoft models scored on Codesota.

RAD-DINO · 2 results · 1 SOTA
NaturalSpeech 3 · ~500M params · 1 result · 1 SOTA
Swin Transformer V2 Large · 197M params · 1 result · 1 SOTA
WavLM Large (SV) · 316M params · 1 result · 1 SOTA
ResNet-50 · 25M params · 3 results
Florence-2-Large · 2 results
KOSMOS-2.5 · 2 results
ResNet-152 · 60M params · 2 results
§ 05 · Sources & freshness

Where these numbers come from.

arxiv · 2 results
ESALE paper (IEEE TSE 2024) · 1 result

3 of 3 rows marked verified. First result 2022-03-07; latest 2024-07-01.