
Title: ProTeCt: Prompt Tuning for Hierarchical Consistency

Abstract: Large vision-language models, like CLIP, learn generalized representations and have shown promising zero-shot performance. Few-shot adaptation methods, based on prompt tuning, have also been shown to further improve performance on downstream datasets. However, these models are not hierarchically consistent: they frequently infer incorrect labels at coarser taxonomic levels even when the inference at the leaf level (the original class labels) is correct. This is problematic given their support for open-set classification and, in particular, open-granularity classification, where practitioners define label sets at various levels of granularity. To address this problem, we propose a prompt tuning technique to calibrate the hierarchical consistency of model predictions. Two metrics of hierarchical consistency, the Hierarchical Consistent Accuracy (HCA) and the Mean Treecut Accuracy (MTA), are first proposed to benchmark model performance in the open-granularity setting. A prompt tuning technique, denoted Prompt Tuning for Hierarchical Consistency (ProTeCt), is then proposed to calibrate classification across all possible label set granularities. Results show that ProTeCt can be combined with existing prompt tuning methods to significantly improve open-granularity classification performance without degrading the original classification performance at the leaf level.
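
For intuition, the minimal Python sketch below illustrates what a hierarchical-consistency metric such as HCA measures: a sample counts as correct only when the prediction is right at the leaf level and at every coarser ancestor level of the taxonomy. This reading is an assumption based on the metric's name and the failure mode described in the abstract, not the paper's implementation; the toy taxonomy, label names, and function are hypothetical.

    from typing import Dict, List

    # Toy taxonomy (hypothetical): each leaf label maps to its chain of
    # labels from the coarsest level down to the leaf.
    ANCESTORS: Dict[str, List[str]] = {
        "tabby_cat": ["animal", "cat", "tabby_cat"],
        "siamese_cat": ["animal", "cat", "siamese_cat"],
        "oak": ["plant", "tree", "oak"],
    }

    def hierarchical_consistent_accuracy(
        predictions: List[List[str]],  # per-sample predicted label at each level
        leaf_targets: List[str],       # per-sample ground-truth leaf label
    ) -> float:
        """Fraction of samples predicted correctly at every level of the hierarchy."""
        correct = 0
        for pred_chain, leaf in zip(predictions, leaf_targets):
            # A sample counts only if it matches the full ground-truth chain,
            # i.e. it is correct at all granularities, not just the leaf.
            if pred_chain == ANCESTORS[leaf]:
                correct += 1
        return correct / max(len(leaf_targets), 1)

    # Example: the second sample is correct at the leaf ("siamese_cat") but
    # wrong at the coarsest level ("plant" vs. "animal"), so it does not count.
    preds = [
        ["animal", "cat", "tabby_cat"],
        ["plant", "cat", "siamese_cat"],
    ]
    print(hierarchical_consistent_accuracy(preds, ["tabby_cat", "siamese_cat"]))  # 0.5

Ordinary leaf-level accuracy on this toy example would be 1.0, which is exactly the gap between standard accuracy and a consistency-aware metric that the abstract motivates.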
Subjects: Computer Vision and Pattern Recognition (cs.CV)
Cite as: arXiv:2306.02240 [cs.CV]
  (or arXiv:2306.02240v1 [cs.CV] for this version)

Submission history

From: Tz-Ying Wu
[v1] Sun, 4 Jun 2023 02:55:25 GMT (2969kb,D)
[v2] Thu, 28 Mar 2024 05:35:46 GMT (3467kb,D)
