Table 2 Summary of the reviewed work on analysis techniques

From: A survey on automatic techniques for enhancement and analysis of digital photography

| Authors | Image sources | Set size | Main goal | Assess. method: used metrics | Results |
|---|---|---|---|---|---|
| Liu et al. [90] | Photosig [10], NUS-WIDE [29], Kodak [172] | 1,300,000 | CBIR | Obj.: precision | 14.5 % |
| Li et al. [83] | NI | 70,000 | ROI detection | Obj.: precision and recall | NI |
| Pang et al. [117] | Flickr [50] | 50,000 | Grouping | Sub.: scale (1–5) | Average rank > 4 |
| Sinha [141] | Flickr [50], Picasa [58] | 40,000 | Grouping | Obj.: JS divergence | JS div. < 0.3 |
| Tong et al. [152] | Corel [32], MS | 29,540 | Assess.: home user vs. photographer | Obj.: MSE | 11.1 |
| Marshall [99] | MIR FLICKR 25000 [51] | 25,000 | CBIR | NI | NI |
| O’Hare [113] | Own | 23,774 | Grouping | Obj.: H-hit rule | NI |
| Dao et al. [37] | Picasa [58] | 19,101 | Grouping | Obj.: F-measure | NI |
| Luo et al. [94] | Web | 17,613 | Assess.: high vs. low quality | Obj.: accuracy | 95 % |
| Yao et al. [175] | Photo.net [60] | 13,302 | Assess.: ranking | Sub.: scale (0–100) | 75.33 % |
| Ke et al. [73] | DpChallenge [20] | 12,000 | Assess. | Sub.: scale (1–10) | 72 % |
| Yeh et al. [177] | DpChallenge [20], Flickr [50] | 12,000 | Assess.: ranking | Sub.: scale (1–10) | 81 % |
| Yeh et al. [176] | DpChallenge [20], Flickr [50] | 12,000 | Assess.: ranking | Sub.: scale (1–10) | 93 % |
| Sandnes [132] | Own | 7,672 | Grouping | Obj.: accuracy | 88.1 % |
| Su et al. [145] | DpChallenge [20] | 6,000 | Assess. | Sub.: scale (1–10) | 92.06 % |
| Boutell et al. [9] | Corel [32]/Own | 5,770 | Class.: sunset | Obj.: accuracy | 96.4 % |
| Singla et al. [140] | Own | 4,500 | Summ. | Obj.: precision and recall | NI |
| Oliveira et al. [114] | Web | 3,700 | Class.: photo vs. graphic | Obj.: cross-validation | 95.6 % |
| Datta et al. [39, 41] | Photo.net [60] | 3,581 | Assess.: ranking | Sub.: scale (1–7) | 70.12 % |
| Obrador et al. [111] | Photo.net [60] | 3,141 | Class.: high vs. low aesthetics | Sub.: scale (1–7) | 66.5 % |
| Zhang et al. [187] | Own | 2,597 | Grouping | NI | NI |
| Tong et al. [151] | Corel [32] | 2,355 | Class.: blur | Obj.: accuracy | 98.6 % |
| Obrador [108] | NI | 2,000 | Assess.: ranking | Sub.: 6 grades | 37.5 % |
| Shen et al. [139] | Web, Flickr [50] | 2,000 | Detect.: dissection lines | Sub.: TP + FP | 80.87 and 33.61 % |
| Cooper et al. [30] | Own | 1,449 | Class.: event | Obj.: F-measure | 0.8568 |
| Serrano et al. [137] | Web | 1,200 | Class.: indoor vs. outdoor | Obj.: accuracy | 90.2 % |
| Chu et al. [26] | Own | 1,199 | Grouping | Obj.: precision | 0.68 |
| Chu et al. [27, 28] | Flickr [50] | 1,024 | Grouping | Sub.: scale (1–5) | Satisfaction > 4 |
| Tang et al. [148] | Picasa [58] | 975 | Grouping | Obj.: precision and recall | NI |
| Loui et al. [92] | NI | 943 | Grouping | Sub.: correlation | 0.84 |
| Lo Presti et al. [121] | Gallagher [54] | 589 | Retrieval | Obj.: error rate | 27.68 % |
| Kim et al. [75] | Own | 564 | Grouping | Obj.: precision at top-N | MAP > 0.4 |
| Li et al. [81] | Flickr [50] | 500 | Assess.: ranking | Sub.: choice | 51 % |
| Li et al. [80] | Flickr [50] | 500 | Assess. & class. | Sub.: scale (0–10) | Residual sum of squares: 2.38 |
| Khan et al. [74] | Li et al. [81] | 500 | Assess.: ranking | Sub.: choice | 61.10 % |
| Jiang et al. [71] | Flickr [50], Kodak [172], Own | 450 | Assess.: ranking | Sub.: scale (0–100) | MSE < 17 |
| Obrador et al. [110] | Own | 200 | Grouping | Sub.: choice | 75 % |
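Several of the objective ("Obj.") entries above report precision, recall, or F-measure. As a quick reference for how these standard metrics relate, here is a minimal sketch (not tied to any specific paper in the table; function and variable names are illustrative):

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple:
    """Standard retrieval/detection metrics from raw counts.

    tp: true positives (correct detections)
    fp: false positives (false alarms)
    fn: false negatives (misses)
    """
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F-measure (F1): harmonic mean of precision and recall.
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Example: 80 correct detections, 20 false alarms, 10 misses.
p, r, f = precision_recall_f1(tp=80, fp=20, fn=10)
# precision = 0.8, recall ~ 0.889, F1 ~ 0.842
```

Precision-only results (e.g. Liu et al. [90]) ignore misses, while F-measure results (e.g. Cooper et al. [30]) balance both error types, which is worth keeping in mind when comparing figures across rows.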