Human Generated Data

Title

Africans Dancing in the Street

Date

1832

People

Artist: Ferdinand-Victor-Eugène Delacroix, French, 1798–1863

Classification

Drawings

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.349

Machine Generated Data

Tags

Amazon
created on 2020-04-25

Human 99.6
Person 99.6
Person 98.8
Art 97.4
Painting 97.4
Person 95.1
Person 75.3
Person 72.5
Person 68.8
Leisure Activities 63.2
Musician 58.9
Musical Instrument 58.9
Duel 58.2
Lute 55.1
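
A minimal sketch of how a label list like the Amazon block above is typically produced and formatted. The live call (shown in comments) would use AWS Rekognition's `detect_labels` via `boto3`; the `sample_response` below is hypothetical data shaped like Rekognition's output, seeded with a few values from this record.

```python
def top_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs sorted by descending confidence."""
    labels = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(labels, key=lambda pair: pair[1], reverse=True)

# Live usage would look roughly like this (requires AWS credentials and an
# image file; filename here is hypothetical):
#   import boto3
#   client = boto3.client("rekognition")
#   with open("drawing.jpg", "rb") as f:
#       response = client.detect_labels(Image={"Bytes": f.read()})

# Hypothetical response fragment, seeded with values from this record.
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.6},
        {"Name": "Painting", "Confidence": 97.4},
        {"Name": "Musician", "Confidence": 58.9},
        {"Name": "Lute", "Confidence": 55.1},
    ]
}

for name, confidence in top_labels(sample_response):
    print(f"{name} {confidence}")
```

Each service applies its own confidence cutoff, which is why the Clarifai and Imagga lists below differ in length and threshold.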

Clarifai
created on 2020-04-25

print 99.8
illustration 99.8
people 99.7
art 99.7
group 99.5
painting 99.4
cavalry 98.7
lithograph 98.2
man 97.9
mammal 96.7
transportation system 95.9
many 95.6
seated 94.9
adult 94.7
herder 94.4
camel 93.2
sword 90.9
weapon 90.6
military 89.2
woman 88.9

Imagga
created on 2020-04-25

graffito 100
decoration 85.8
old 25.8
armor 22.7
shield 21.5
travel 18.3
stone 18.1
architecture 18
culture 17.9
ancient 17.3
traditional 16.6
art 15.7
religion 15.2
statue 15.2
history 15.2
tourism 14.8
holiday 14.3
wall 13.7
costume 13.5
protective covering 12.8
mask 12.7
covering 12.6
city 12.5
man 12.1
famous 12.1
historic 11.9
soldier 11.7
sculpture 11.6
vintage 11.6
antique 10.4
historical 10.3
building 10.3
tourist 10.1
temple 9.7
military 9.7
urban 9.6
religious 9.4
grunge 9.4
tradition 9.2
aged 9
texture 9
person 8.8
warrior 8.8
war 8.7
medieval 8.6
monument 8.4
people 8.4
dark 8.3
vacation 8.2
dirty 8.1
landmark 8.1
fantasy 8.1
detail 8
face 7.8
sitting 7.7
helmet 7.7
festival 7.7
color 7.2

Google
created on 2020-04-25

Microsoft
created on 2020-04-25

painting 99.3
drawing 99.2
text 98.2
book 93.1
sketch 92.2
child art 84.8
person 84.1
clothing 70.9
cartoon 66.5
several 13.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-60
Gender Male, 54.9%
Sad 45.2%
Angry 45.1%
Calm 45%
Confused 45.1%
Happy 45%
Surprised 47.8%
Disgusted 45%
Fear 51.8%

AWS Rekognition

Age 39-57
Gender Male, 54.5%
Fear 45.1%
Surprised 47.5%
Angry 51.8%
Sad 45%
Calm 45.3%
Happy 45.2%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 26-40
Gender Female, 50.1%
Fear 49.5%
Confused 49.5%
Happy 49.5%
Calm 50%
Sad 49.8%
Surprised 49.5%
Disgusted 49.5%
Angry 49.7%

AWS Rekognition

Age 21-33
Gender Male, 50.5%
Angry 49.8%
Fear 49.5%
Happy 49.5%
Surprised 49.6%
Calm 49.8%
Confused 49.6%
Disgusted 49.5%
Sad 49.6%

AWS Rekognition

Age 5-15
Gender Female, 53.8%
Happy 45.1%
Confused 45.4%
Angry 47.1%
Calm 46.3%
Surprised 45.7%
Disgusted 45%
Fear 46.4%
Sad 49.1%

AWS Rekognition

Age 17-29
Gender Female, 51.5%
Surprised 45%
Confused 45%
Angry 45.1%
Fear 45%
Calm 52.8%
Disgusted 45%
Sad 47%
Happy 45.1%

AWS Rekognition

Age 22-34
Gender Female, 52.6%
Angry 47.6%
Happy 46.1%
Fear 45.2%
Calm 46.5%
Surprised 45.1%
Disgusted 45%
Sad 49.4%
Confused 45%
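
A minimal sketch of how one of the face blocks above can be condensed into a single summary line. The `sample_face` is a hypothetical fragment shaped like a Rekognition `DetectFaces` FaceDetail (as returned with `Attributes=["ALL"]`), seeded with values from the first face listed in this record.

```python
def summarize_face(face):
    """Condense one FaceDetail into an age / gender / top-emotion line."""
    age = face["AgeRange"]
    gender = face["Gender"]
    # Rekognition scores every emotion independently; the dominant emotion
    # is simply the highest-scoring entry.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return (
        f"Age {age['Low']}-{age['High']}, "
        f"Gender {gender['Value']} {gender['Confidence']}%, "
        f"{top['Type'].capitalize()} {top['Confidence']}%"
    )

# Hypothetical FaceDetail fragment, seeded from the first face block above.
sample_face = {
    "AgeRange": {"Low": 42, "High": 60},
    "Gender": {"Value": "Male", "Confidence": 54.9},
    "Emotions": [
        {"Type": "FEAR", "Confidence": 51.8},
        {"Type": "SURPRISED", "Confidence": 47.8},
        {"Type": "SAD", "Confidence": 45.2},
    ],
}

print(summarize_face(sample_face))
```

Note that several faces above have all emotion scores clustered near 45–50%, so the "dominant" emotion for those faces is only marginally ahead of the rest.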

Feature analysis

Amazon

Person 99.6%
Painting 97.4%

Categories

Imagga

pets animals 99.9%