Human Generated Data

Title

Untitled (soup on route)

Date

c. 1914-1918

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Richard E. Bennink, 2.2002.4138

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Book 100
Human 97.9
Person 97.9
Person 96.6
Person 95.9
Person 95.3
Person 94.5
Person 90.7
Person 89.3
Person 82.9
Person 81.3
Art 76.4
Text 72.5
Person 71.3
People 71
Person 71
Person 69.8
Person 66.8
Person 66.8
Person 65.3
Military 63.3
Military Uniform 63.3
Crowd 62.4
Indoors 62
Person 60.5
Drawing 56.4
Soldier 56.2

Imagga
created on 2022-03-12

bookend 74.7
support 57
device 44.1
book jacket 31.5
old 28.5
shelf 24.8
jacket 24.5
vintage 21.5
book 19.3
antique 19
wrapping 18.6
money 17.8
paper 17.2
grunge 17
binding 16.7
retro 16.4
covering 15.7
cash 15.5
finance 15.2
texture 14.6
currency 14.3
design 14.1
financial 13.3
art 13.2
dollar 13
ancient 13
banking 12.9
bank 12.5
business 12.1
building 12
black 11.4
brown 11
historic 11
architecture 10.9
frame 10.8
religion 10.7
exchange 10.5
bill 10.4
wall 10.3
investment 10.1
market 9.8
damaged 9.5
dirty 9
wealth 9
history 8.9
material 8.9
pattern 8.9
banknote 8.7
hundred 8.7
education 8.7
empty 8.6
grungy 8.5
savings 8.4
rich 8.4
tourism 8.2
aged 8.1
graphic 8
interior 8
decoration 8
carved 7.8
budget 7.8
color 7.8
door 7.7
dollars 7.7
debt 7.7
payment 7.7
temple 7.6
religious 7.5
one 7.5
document 7.4
style 7.4
stack 7.4
structure 7.3
object 7.3
open 7.2
home 7.2

Google
created on 2022-03-12

Microsoft
created on 2022-03-12

wall 99
text 96.4
art 94
indoor 91.7
drawing 89.2
book 77.7
museum 64.6
old 59.8
different 36.8
clothes 23.5

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 89%
Calm 97.9%
Sad 1.4%
Angry 0.4%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 95.4%
Angry 1.3%
Sad 1.3%
Confused 1.1%
Disgusted 0.5%
Surprised 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 22-30
Gender Male, 64.3%
Sad 87.8%
Calm 6.5%
Happy 2.3%
Disgusted 1.5%
Angry 0.5%
Confused 0.5%
Surprised 0.4%
Fear 0.4%

AWS Rekognition

Age 26-36
Gender Male, 97.2%
Calm 98.4%
Fear 0.6%
Sad 0.4%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Male, 96.1%
Calm 97.6%
Sad 1.3%
Angry 0.5%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 35-43
Gender Female, 62.1%
Happy 59.1%
Calm 25%
Sad 5.5%
Fear 3.4%
Surprised 2.5%
Angry 1.8%
Disgusted 1.5%
Confused 1.2%

AWS Rekognition

Age 18-26
Gender Female, 81.3%
Calm 76.1%
Happy 10.7%
Angry 5.5%
Sad 2.8%
Disgusted 1.7%
Confused 1.5%
Surprised 1.1%
Fear 0.4%

AWS Rekognition

Age 20-28
Gender Female, 78.5%
Calm 99.6%
Happy 0.1%
Sad 0.1%
Surprised 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

Feature analysis

Amazon

Book 100%
Person 97.9%

Captions

Microsoft

a group of people posing for a photo 76.2%
an old photo of a group of people posing for the camera 73.2%
a group of people posing for the camera 73.1%

Text analysis

Amazon

LA
EN
SOUPE
ROUT
241. LA SOUPE EN ROUT
S.T.L.
241.

Google

241. LA SOUPE EN ROUT
241.
ROUT
LA
SOUPE
EN