Human Generated Data

Title

Untitled (people seated at tables under awnings)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7662

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.4
Human 98.4
Person 97.7
Person 97.7
Person 97.4
Person 97
Person 96.8
Person 95.6
Person 95.4
Person 95.1
Crowd 94.6
Person 93
Person 91.9
Audience 90.2
Person 89.5
Person 89.2
Person 84.1
People 77.4
Person 71.2
Person 70.4
Cafeteria 67.7
Restaurant 67.7
Person 65.9
Person 65.2
Person 65.1
Photography 64.2
Photo 64.2
Sitting 62.7
Suit 62.4
Clothing 62.4
Coat 62.4
Overcoat 62.4
Apparel 62.4
Meal 59
Food 59
Indoors 57.8
Person 56.4
Face 55.8
Funeral 55.2
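
The Amazon tags above pair each detected label with a confidence score from 0 to 100. As a minimal sketch, labels in this form could be regenerated with the AWS Rekognition DetectLabels API via boto3; the image file name and the MinConfidence threshold are assumptions, not part of this record:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical file name; the actual image source for this record is unknown.
    with open("steinmetz_untitled.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns labels with 0-100 confidence scores,
    # matching the "Person 98.4"-style entries listed above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed cutoff; the lowest tag listed is 55.2
    )

    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))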

Clarifai
created on 2023-10-25

people 99.9
many 99.8
group 99.5
group together 98.1
man 97
woman 96.9
administration 96.4
adult 96.2
crowd 94.7
child 92.9
audience 92.7
leader 91
war 91
music 89.9
recreation 89.3
furniture 82
boy 81.7
spectator 81.4
chair 79.2
family 76.3

Imagga
created on 2022-01-08

people 31.8
man 30.9
male 28.3
person 23.5
senior 22.5
couple 21.8
group 19.3
happy 18.2
old 17.4
men 17.2
business 15.2
room 15
adult 14.5
school 13.9
photographer 13.7
businessman 13.2
smiling 13
student 12.4
classroom 12.4
together 12.3
home 11.2
teacher 11.1
women 11.1
portrait 11
building 10.3
silhouette 9.9
hand 9.9
team 9.8
success 9.6
life 9.6
musical instrument 9.6
elderly 9.6
spectator 9.5
love 9.5
meeting 9.4
happiness 9.4
aged 9
black 9
looking 8.8
retired 8.7
professional 8.7
busy 8.7
education 8.7
sitting 8.6
cheerful 8.1
handsome 8
family 8
child 7.8
work 7.8
crowd 7.7
brass 7.6
two 7.6
kin 7.6
city 7.5
mature 7.4
indoor 7.3
entrepreneur 7.3
blackboard 7.2
job 7.1
indoors 7
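
Imagga's tags follow the same label-plus-confidence shape. A sketch of an equivalent request against Imagga's v2 tagging endpoint using the requests library; the API key, secret, and file name are placeholders:

    import requests

    # Placeholder credentials; Imagga issues a key/secret pair per account.
    IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"

    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )

    # Each result carries an English tag name and a 0-100 confidence.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))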

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99
text 95.7
black and white 78.5
group 73.9
clothing 71.4
man 70.7
people 58.2
crowd 4.2

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 96.7%
Calm 99.2%
Sad 0.6%
Happy 0%
Angry 0%
Confused 0%
Disgusted 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 23-33
Gender Male, 93.4%
Calm 88.8%
Sad 6.4%
Happy 2.4%
Confused 0.8%
Surprised 0.6%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Male, 64.6%
Calm 64.1%
Sad 32.4%
Disgusted 1.1%
Confused 0.8%
Happy 0.7%
Surprised 0.5%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Male, 90.3%
Happy 98.1%
Calm 1.2%
Sad 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 64.4%
Calm 98.4%
Sad 0.9%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%
Confused 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 27-37
Gender Male, 96.6%
Calm 85.4%
Sad 11.6%
Angry 0.8%
Surprised 0.7%
Disgusted 0.4%
Happy 0.4%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 19-27
Gender Female, 97.8%
Calm 44.9%
Sad 35.3%
Happy 17.3%
Fear 0.7%
Confused 0.6%
Disgusted 0.5%
Angry 0.4%
Surprised 0.2%
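
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and a distribution over eight emotions. A sketch of retrieving the same attributes with boto3's DetectFaces call, reusing the same hypothetical file name as before:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    # Attributes=["ALL"] requests age range, gender, and emotions
    # rather than the default bounding-box-only response.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort by confidence, as in the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")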

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
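
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision Python client's face_detection method; the file name is again a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)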

Feature analysis

Amazon

Person 98.4%

Text analysis

Amazon

19160.
17160
17160.
NACOX
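
The strings above are raw OCR output. A sketch of the corresponding Rekognition DetectText call, again under the hypothetical file name:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Rekognition returns both LINE and WORD detections; print lines only.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])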

Google

O ררנ 160.
O
ררנ
160.