Human Generated Data

Title

Untitled (chefs in a kitchen standing behind a table of turkeys)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7741

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

(each label below is paired with the service's confidence score, on a 0–100 scale)

Amazon
created on 2022-01-09

Restaurant 99.9
Person 99.4
Human 99.4
Person 99.3
Person 99
Cafeteria 98.5
Person 98.2
Person 98.2
Person 97.3
Person 95.6
Cafe 88.5
Meal 82
Food 82
Indoors 72.1
Housing 71.5
Building 71.5
Interior Design 61.7
Airplane 60.9
Transportation 60.9
Vehicle 60.9
Aircraft 60.9
Room 59.4
Shop 56.6
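
Labels like these can be reproduced with AWS Rekognition's DetectLabels operation; the repeated Person rows with distinct scores look like per-instance detections gathered under the single Person label. A minimal boto3 sketch, assuming a local copy of the print (the file name, region, and thresholds are placeholders):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# "steinmetz_7741.jpg" is a hypothetical local copy of the print.
with open("steinmetz_7741.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,      # cap chosen to roughly match the list above
        MinConfidence=55,  # lowest score shown above is 56.6
    )

for label in response["Labels"]:
    # One row per unique label, e.g. "Restaurant 99.9"
    print(f"{label['Name']} {label['Confidence']:.1f}")
    for instance in label["Instances"]:
        # Localized instances; likely the source of the repeated
        # "Person" rows with distinct scores in the list above
        print(f"{label['Name']} {instance['Confidence']:.1f}")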

Imagga
created on 2022-01-09

restaurant 48
cafeteria 45.3
interior 41.6
building 33.5
room 28.5
modern 26.6
furniture 25.7
chair 25.5
structure 25.2
table 24.9
business 21.3
design 19.7
inside 19.3
indoors 18.5
shop 18.3
work 18
counter 17.7
light 16.7
architecture 16.4
house 15.9
indoor 15.5
office 15.4
lamp 15.3
floor 14.9
center 14.7
equipment 14.2
steel 14.1
people 13.9
empty 12.9
kitchen 12.8
home 12.8
chairs 12.7
wood 12.5
3d 12.4
barbershop 12.4
glass 11.7
mercantile establishment 11.6
decor 11.5
comfortable 11.5
man 11.4
contemporary 11.3
window 11.2
travel 10.6
urban 10.5
dining 10.5
bar 10.2
transportation 9.9
apartment 9.6
luxury 9.4
industry 9.4
lights 9.3
city 9.1
classroom 8.8
lifestyle 8.7
living 8.5
seat 8.5
hall 8.4
place of business 8.3
occupation 8.2
salon 8.2
plant 8.2
stylish 8.1
decoration 8
device 7.9
sofa 7.7
airport 7.6
technology 7.4
style 7.4
reflection 7.3
station 7.3
life 7.3
metal 7.2
male 7.1
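
Imagga tags of this kind come from its REST tagging endpoint. A minimal sketch, assuming the /v2/tags endpoint with HTTP basic auth (the credentials and file name are placeholders):

import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"  # placeholders

# "steinmetz_7741.jpg" is a hypothetical local copy of the print.
with open("steinmetz_7741.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth
        files={"image": f},
    )
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Emits pairs such as "restaurant 48.0", the format used above
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")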

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

indoor 95.5
text 92.9
black and white 92.6
person 87.5
restaurant 67.3
table 65.8
clothing 61.6
worktable 14.5
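
Tags like these map to Azure Computer Vision's image-tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and file name are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Endpoint and key are placeholders for a Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("steinmetz_7741.jpg", "rb") as f:  # hypothetical local copy
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # SDK confidence is 0-1; scaled to match the 0-100 scores above
    print(f"{tag.name} {tag.confidence * 100:.1f}")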

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 100%
Sad 88.9%
Calm 4.4%
Confused 3%
Happy 2.7%
Surprised 0.4%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Fear 44.8%
Calm 42.5%
Happy 9.8%
Sad 0.8%
Confused 0.7%
Disgusted 0.6%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 36-44
Gender Male, 99.5%
Calm 89.6%
Happy 6.9%
Sad 1.6%
Fear 0.8%
Confused 0.3%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 34-42
Gender Male, 63%
Happy 52.2%
Calm 39.9%
Sad 1.7%
Angry 1.7%
Surprised 1.3%
Confused 1.2%
Fear 1%
Disgusted 1%

AWS Rekognition

Age 49-57
Gender Male, 73.5%
Happy 95.9%
Sad 1.7%
Surprised 1.3%
Calm 0.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Male, 82.5%
Calm 47.6%
Happy 21.2%
Sad 10.5%
Angry 8.7%
Confused 5.3%
Disgusted 3.3%
Fear 1.8%
Surprised 1.5%

AWS Rekognition

Age 35-43
Gender Male, 96.7%
Calm 78.7%
Fear 10%
Happy 6.3%
Surprised 1.8%
Confused 1.1%
Sad 0.8%
Angry 0.7%
Disgusted 0.7%
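
The per-face blocks above match the shape of AWS Rekognition's DetectFaces output when all facial attributes are requested, with one block per detected face. A minimal boto3 sketch, assuming a local copy of the print:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_7741.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # the default subset omits age and emotions
    )

for face in response["FaceDetails"]:  # one entry per detected face
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"],
                      reverse=True)
    for emotion in emotions:
        # e.g. "Sad 88.9%", matching the blocks above
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")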

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
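
The likelihood ratings above match Google Cloud Vision face detection, which scores each detected face for surprise, anger, sorrow, joy, headwear, and blur. A minimal sketch with the google-cloud-vision client (the file name is a placeholder; application credentials are assumed to be configured):

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # needs application credentials

with open("steinmetz_7741.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

def pretty(likelihood):
    # Likelihood.VERY_UNLIKELY -> "Very unlikely", the wording used above
    return likelihood.name.replace("_", " ").capitalize()

for face in client.face_detection(image=image).face_annotations:
    print(f"Surprise {pretty(face.surprise_likelihood)}")
    print(f"Anger {pretty(face.anger_likelihood)}")
    print(f"Sorrow {pretty(face.sorrow_likelihood)}")
    print(f"Joy {pretty(face.joy_likelihood)}")
    print(f"Headwear {pretty(face.headwear_likelihood)}")
    print(f"Blurred {pretty(face.blurred_likelihood)}")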

Feature analysis

Amazon

Person 99.4%
Airplane 60.9%
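
One plausible reading of this section is that it lists the DetectLabels results that carry bounding-box Instances, i.e., the objects Rekognition actually localized in the frame (Person, plus the spurious Airplane). A hedged sketch of that filter, under the same placeholder assumptions as above:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_7741.jpg", "rb") as f:  # hypothetical local copy
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

for label in labels:
    if label["Instances"]:  # keep only labels localized with bounding boxes
        print(f"{label['Name']} {label['Confidence']:.1f}%")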

Captions

Microsoft

a group of people standing in front of a store 66.9%
a group of people in a room 66.8%
a group of people in front of a store 64.7%
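
Ranked captions like these can be produced with Azure Computer Vision's describe operation, which returns candidate captions with confidences. A minimal sketch (the endpoint, key, and file name are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Endpoint and key are placeholders for a Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

with open("steinmetz_7741.jpg", "rb") as f:  # hypothetical local copy
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # e.g. "a group of people standing in front of a store 66.9%"
    print(f"{caption.text} {caption.confidence * 100:.1f}%")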

Text analysis

Amazon

39220

Google

372 20
20
372
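
OCR strings like these can be reproduced with AWS Rekognition's DetectText (and, for the Google column, Cloud Vision's text detection). A minimal DetectText sketch, assuming a local copy of the print:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_7741.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text
        print(detection["DetectedText"])  # e.g. "39220"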