Human Generated Data

Title

University of Baghdad

Date

c. 1957

People

Artist: Unidentified Artist

Classification

Archival Material

Machine Generated Data

Tags

Amazon

Human 99.2
Person 99.2
Person 98.5
Person 97.2
Person 94.7
Person 93.8
Nature 93.8
Person 90.2
Art 85.2
Drawing 85.2
Text 83.1
Person 81.2
Outdoors 80.2
Building 69.4
Person 68.7
Person 67.8
Sketch 64
People 63.8
Urban 62.1
Person 61.4
Office Building 56.7
Crowd 55.4
Person 52

Clarifai

people 99.8
group 99.2
print 99
illustration 98.7
many 97.7
adult 96.4
vehicle 96.4
home 95.2
wear 93.3
art 93.2
man 93.2
group together 93.1
street 92
war 90.8
building 90.1
military 88
furniture 87.7
no person 86.2
cavalry 85.8
weapon 85.3

Imagga

sketch 44.1
drawing 34.6
building 30.5
newspaper 27.6
architecture 27.5
old 26.5
representation 23.9
city 23.3
vintage 22.3
grunge 21.3
product 21.2
antique 18.2
wall 17.2
texture 16.7
creation 16.5
tourism 16.5
window 16.3
house 15.9
ancient 15.6
travel 15.5
art 15
structure 15
retro 14.8
street 14.7
historic 14.7
landmark 14.4
facade 14.3
frame 14.2
stone 14
pattern 13.7
aged 13.6
design 13.5
paper 13.3
urban 13.1
dirty 12.7
history 12.5
grungy 12.3
shop 12.2
decoration 12.2
famous 12.1
black 12
snow 12
space 11.6
rough 10.9
daily 10.9
paint 10.9
barbershop 10.7
weathered 10.4
tourist 10.1
border 10
mercantile establishment 9.7
damaged 9.5
door 9.2
brown 8.8
home 8.8
text 8.7
attraction 8.6
dirt 8.6
worn 8.6
film 8.6
culture 8.5
historical 8.5
exterior 8.3
style 8.2
material 8
textured 7.9
lamp 7.6
old fashioned 7.6
destination 7.5
negative 7.4
town 7.4
digital 7.3
business 7.3
tower 7.2

Google

Paper 69.5
Paper product 50.3

Microsoft

text 99.5
old 87.2
house 82.1
drawing 75.1
building 57

Face analysis

Amazon

AWS Rekognition

Age 48-68
Gender Male, 50.8%
Confused 45.2%
Surprised 45.1%
Happy 45.2%
Angry 45.4%
Sad 52%
Disgusted 46.2%
Calm 45.8%

AWS Rekognition

Age 14-23
Gender Male, 50.2%
Sad 49.6%
Confused 49.7%
Disgusted 49.5%
Surprised 49.6%
Angry 49.6%
Happy 49.5%
Calm 50%

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

an old photo of a book 52%
an old photo of a person 51.9%
an old photo of a person holding a book 41.9%

Text analysis

Amazon

TYPICAL
INTERIOR
65
DORMITOR
IES,
38/A IES, TYPICAL
page
38/A
COURT
29ll2

Google

3 272
3
272