Human Generated Data

Title

Untitled (chefs standing behind large buffet, Azure Tides)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10498

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.6
Person 98.6
Person 98.1
Person 97.8
Person 89.5
Clothing 88.9
Apparel 88.9
Person 87.1
Market 84.8
Person 84
Bazaar 80.9
Shop 80.9
Person 80.2
Person 78.7
Food 74.8
Meal 74.8
Person 67.5
People 66.3
Dish 65.3
Person 41.7

Clarifai
created on 2023-10-25

people 99.9
many 98.7
group 98.7
man 97.2
woman 97
adult 96.9
group together 91.2
leader 87.4
military 86.9
food 84.5
interaction 84.1
administration 82.7
monochrome 80.1
child 80
one 80
war 77.9
indoors 77.4
art 77.1
wear 75.8
several 74.7

Imagga
created on 2022-01-09

crown 30.3
glass 23.6
crown jewels 20.6
water 18.7
black 17.1
light 13.4
celebration 12.8
fountain 12.3
crystal 11.4
grunge 11.1
blackboard 10.9
structure 10.8
design 10.7
night 10.7
party 10.3
regalia 10.3
pattern 10.3
transparent 9.8
cold 9.5
elegant 9.4
decoration 9.4
ice 9.4
holiday 9.3
close 9.1
drop 9.1
texture 9
art 8.9
cool 8.9
symbol 8.7
frozen 8.6
menorah 8.5
color 8.3
reflection 8.3
vintage 8.3
flowers 7.8
old 7.7
case 7.6
chandelier 7.5
drink 7.5
splash 7.5
style 7.4
retro 7.4
digital 7.3
music 7.2
shredder 7.2
religion 7.2
wet 7.1
snow 7.1
headdress 7.1
device 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.5
indoor 93.8
person 90.5
black and white 71.5
several 15.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Calm 52.4%
Sad 25.8%
Happy 9.9%
Angry 4.8%
Disgusted 2.1%
Surprised 2%
Confused 1.5%
Fear 1.5%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Happy 87.8%
Calm 6.3%
Sad 2.7%
Fear 1.9%
Disgusted 0.4%
Confused 0.4%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 97.5%
Happy 0.7%
Fear 0.6%
Sad 0.3%
Surprised 0.2%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 23-31
Gender Male, 85.5%
Calm 99.8%
Sad 0.1%
Surprised 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 37-45
Gender Male, 100%
Calm 82.7%
Happy 13%
Sad 1.4%
Confused 1.1%
Surprised 0.6%
Disgusted 0.5%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 37-45
Gender Male, 85.9%
Calm 89.9%
Happy 8.2%
Surprised 1%
Fear 0.3%
Angry 0.2%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 35-43
Gender Male, 95.9%
Sad 45%
Calm 17.9%
Disgusted 17.1%
Confused 10.6%
Fear 4.1%
Surprised 2.3%
Happy 1.5%
Angry 1.4%

Feature analysis

Amazon

Person 98.6%