Human Generated Data

Title

Untitled (men picking through a bin of tomatoes)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5490

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Hat 99.7
Clothing 99.7
Apparel 99.7
Human 99.6
Person 99.6
Person 99.3
Person 98.7
Person 98
Person 97.6
Person 97.4
Person 96.9
Person 95.2
Furniture 95
Chair 94.8
Person 94.7
Person 94.5
Person 93
Person 89.5
Person 88.9
Meal 84.3
Food 84.3
Room 69
Indoors 69
Couch 64.4
Person 63.5
Sitting 60.4
Text 59.9
Cafeteria 59.8
Restaurant 59.8
Cafe 55.3
People 55.3
Chair 52.6
Person 50.7
Person 46

Imagga
created on 2022-01-23

people 38.5
person 35
man 34.9
businessman 34.4
male 34
business 30.3
meeting 28.2
group 26.6
room 24.5
businesswoman 23.6
adult 23.6
happy 23.2
office 22.8
together 22.8
men 22.3
classroom 21
businesspeople 20.9
team 20.6
newspaper 20.6
work 20.4
teacher 20.3
teamwork 19.5
desk 18.9
table 18.2
smiling 18.1
executive 18
working 17.7
colleagues 17.5
30s 17.3
corporate 17.2
sitting 17.2
product 16.6
job 15.9
indoors 15.8
planner 15.8
professional 15.7
women 15
cheerful 14.6
talking 14.2
senior 14
couple 13.9
computer 13.6
chair 13.5
communication 13.4
blackboard 13.3
worker 13.1
hall 13
education 13
creation 12.9
coworkers 12.8
casual 12.7
teaching 12.7
student 12.3
40s 11.7
discussion 11.7
class 11.6
boy 11.3
modern 11.2
laptop 11
portrait 11
20s 11
day 11
indoor 10.9
restaurant 10.9
lifestyle 10.8
discussing 10.8
suit 10.8
to 10.6
success 10.5
home 10.4
school 10.3
manager 10.2
mature 10.2
happiness 10.2
conference 9.8
new 9.7
cooperation 9.7
studying 9.6
four 9.6
daily 9.6
hands 9.5
females 9.5
two 9.3
back 9.2
successful 9.1
board 9
black 9
building 8.9
color 8.9
associates 8.8
educator 8.8
students 8.8
employee 8.7
ethnic 8.6
smile 8.5
friends 8.4
finance 8.4
presentation 8.4
company 8.4
handsome 8
four people 7.9
collaboration 7.9
casual clothing 7.8
child 7.7
diversity 7.7
hand 7.6
staff 7.6
adults 7.6
career 7.6
horizontal 7.5
learning 7.5
human 7.5
looking 7.2
face 7.1

Google
created on 2022-01-23

Outerwear 95.1
Shirt 94.9
Hat 93.4
Fedora 88.6
Sun hat 87.2
Chair 86
Coat 85.8
Black-and-white 85.6
Style 83.8
Cap 82
Headgear 81.7
Suit 80.6
T-shirt 80.6
Monochrome 74.5
Snapshot 74.3
Monochrome photography 74.2
Cowboy hat 72.4
Font 70.5
Event 70.4
Sitting 68.1

Microsoft
created on 2022-01-23

person 99.8
text 95.4
clothing 94.8
black and white 87
outdoor 86
man 71.6
table 59.4
furniture 57.7
people 56.8
chair 54.5

Face analysis

Amazon

Google

AWS Rekognition

Age 51-59
Gender Male, 99.4%
Calm 99%
Happy 0.7%
Surprised 0.1%
Confused 0.1%
Sad 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 97.4%
Calm 99.9%
Happy 0%
Sad 0%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 79%
Calm 39%
Confused 32%
Happy 14.3%
Sad 7.6%
Surprised 3.1%
Disgusted 1.7%
Fear 1.1%
Angry 1.1%

AWS Rekognition

Age 47-53
Gender Male, 98.1%
Calm 99.7%
Surprised 0.1%
Sad 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Female, 56.7%
Calm 98.3%
Sad 0.7%
Angry 0.3%
Happy 0.3%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 40-48
Gender Female, 91.2%
Calm 90.8%
Happy 6.5%
Confused 1%
Sad 1%
Disgusted 0.3%
Surprised 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Calm 55.6%
Sad 31%
Angry 8.1%
Happy 1.6%
Surprised 1.4%
Confused 1%
Disgusted 0.7%
Fear 0.6%

AWS Rekognition

Age 48-56
Gender Female, 91.9%
Calm 56.9%
Sad 34.5%
Confused 2.8%
Happy 2.3%
Surprised 1.1%
Angry 1.1%
Disgusted 0.7%
Fear 0.7%

AWS Rekognition

Age 45-51
Gender Female, 81%
Calm 85.6%
Sad 9.6%
Confused 2.4%
Angry 1.2%
Disgusted 0.4%
Surprised 0.3%
Happy 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Hat 99.7%
Person 99.6%
Chair 94.8%

Captions

Microsoft

a group of people standing in front of a crowd 85%
a group of people standing in front of a crowd of people 83.2%
a group of people in front of a crowd 83.1%

Text analysis

Amazon

KEY,
SIESTA
ULL
J.
23058
STEINMETZ,
J. J. STEINMETZ, SIESTA KEY, SARASOTA,-FLA.
58
F ULL
SARASOTA,-FLA.
F
23058.

Google

STEINMETZ,
23058. JLL 23058 J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLA.
J.
SARASOTA,
23058.
23058
JLL
KEY,
FLA.
SIESTA