Human Generated Data

Title

Untitled (group of children gathered around restaurant counter)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7019

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.8
Human 99.8
Person 98.9
Person 98.7
Person 98.5
Person 98.4
Person 97.4
Building 96.8
Person 96.5
Person 96
Person 95.3
Person 90.5
Person 90.1
Person 89
Factory 87.9
Cafeteria 74.1
Restaurant 74.1
Assembly Line 68.7
Person 68.6
Crowd 62.5
Person 59.5
Manufacturing 59.5
Housing 59.3
Furniture 59
Table 59
Tabletop 57.8
Room 57.7
Indoors 57.7
People 56
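
The label/confidence pairs above lend themselves to simple programmatic filtering. A minimal Python sketch, using a handful of the Amazon Rekognition pairs copied from the list above (the 90-point threshold and the helper name `confident_labels` are illustrative choices, not part of this record):

```python
# A few (label, score) pairs copied from the Amazon Rekognition tags above.
tags = [
    ("Person", 99.8), ("Human", 99.8), ("Building", 96.8),
    ("Factory", 87.9), ("Cafeteria", 74.1), ("Restaurant", 74.1),
    ("Crowd", 62.5), ("Indoors", 57.7),
]

def confident_labels(pairs, threshold=90.0):
    """Return the distinct labels whose confidence meets the threshold."""
    return sorted({label for label, score in pairs if score >= threshold})

print(confident_labels(tags))  # ['Building', 'Human', 'Person']
```

The same pattern applies to the Imagga, Google, and Microsoft lists that follow, though each service reports confidence on its own scale.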

Imagga
created on 2021-12-15

newspaper 37.7
product 29.8
currency 25.1
person 24.6
creation 24.1
money 23.8
business 22.5
finance 20.3
cash 19.2
dollar 18.5
bill 18.1
bank 17.9
people 17.3
planner 16.5
hundred 15.5
slick 15.4
daily 15.3
exchange 15.3
banking 14.7
man 14.2
adult 13.6
comic book 13.4
financial 13.3
old 13.2
senior 13.1
male 12.8
paper 12.5
group 12.1
wealth 11.7
rich 11.2
technology 11.1
investment 11
businessman 10.6
looking 10.4
economy 10.2
office 9.9
market 9.8
portrait 9.7
us 9.6
payment 9.6
pay 9.6
design 9.6
grunge 9.4
stock 9.3
sign 9
smiling 8.7
education 8.6
loan 8.6
elderly 8.6
student 8.5
classroom 8.4
savings 8.4
teacher 8.4
hand 8.3
letter 8.2
happy 8.1
aged 8.1
symbol 8.1
team 8.1
computer 8
postage 7.9
work 7.8
banknotes 7.8
bills 7.8
banknote 7.8
stamp 7.7
dollars 7.7
blackboard 7.7
notes 7.7
mail 7.6
room 7.6
human 7.5
vintage 7.4
web site 7.4
closeup 7.4
note 7.3
professional 7.3
smile 7.1
copy 7.1
working 7.1
modern 7

Google
created on 2021-12-15

Hairstyle 95.1
Photograph 94.2
Black 89.8
Table 86.6
Black-and-white 86.1
Chair 85.7
Style 84.1
Art 79.2
Monochrome 75.1
Snapshot 74.3
Monochrome photography 74.1
Event 72
Recreation 69.8
Room 69.6
Design 68.4
Hat 68.2
Suit 67.7
Illustration 66.8
Pattern 65.4
Stock photography 65.1

Microsoft
created on 2021-12-15

person 99.9
text 96.4
clothing 91.2
drawing 81.2
cartoon 73.2
woman 70.8
group 65.6
player 62.5
man 55
posing 54.7

Face analysis

AWS Rekognition

Age 25-39
Gender Male, 94.3%
Calm 91.8%
Angry 3%
Confused 2.5%
Sad 1.7%
Surprised 0.4%
Disgusted 0.3%
Happy 0.2%
Fear 0%

AWS Rekognition

Age 29-45
Gender Male, 80.7%
Calm 98.2%
Sad 1.2%
Surprised 0.4%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 23-37
Gender Female, 78.7%
Calm 86.5%
Happy 5.5%
Surprised 5.4%
Angry 1.4%
Confused 0.5%
Disgusted 0.3%
Sad 0.3%
Fear 0.1%

AWS Rekognition

Age 14-26
Gender Female, 95.2%
Calm 90.7%
Sad 7.9%
Happy 0.8%
Confused 0.3%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 43-61
Gender Female, 96.7%
Sad 57%
Calm 41.5%
Happy 0.7%
Angry 0.3%
Confused 0.3%
Fear 0.1%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 10-20
Gender Female, 85.6%
Calm 80%
Happy 14.6%
Sad 3.4%
Disgusted 0.6%
Angry 0.5%
Confused 0.3%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 17-29
Gender Male, 56.6%
Calm 95.5%
Sad 4%
Happy 0.3%
Angry 0.1%
Surprised 0.1%
Confused 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 24-38
Gender Male, 72.7%
Sad 77.5%
Calm 19.2%
Happy 1.1%
Confused 1%
Surprised 0.9%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 44-62
Gender Female, 51.2%
Calm 86.9%
Happy 10.6%
Sad 2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 23-37
Gender Female, 74.4%
Calm 64%
Sad 19.2%
Happy 13.9%
Surprised 1.3%
Confused 0.7%
Angry 0.4%
Fear 0.4%
Disgusted 0.1%

AWS Rekognition

Age 19-31
Gender Male, 66.3%
Calm 97.9%
Sad 1.4%
Fear 0.2%
Happy 0.2%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people posing for a photo 86.8%
a group of people posing for the camera 86.7%
a group of people posing for a picture 86.6%

Text analysis

Amazon

CROWN
COLA
ARE
OYAL
22432
ARE DUE
DUE
KUB
OYAL and CROWN
EVEY'S
ABLE
SMOKING
and

Google

FTIMIST
EMEY
ARE BUET 22432 ABLE FTIMIST EMEY OYAL CROWN COLA
CROWN
22432
ARE
BUET
ABLE
OYAL
COLA