Human Generated Data

Title

Untitled (circus performers seated around table near fireplace)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7388

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Furniture 99.3
Person 99.3
Person 98.9
Person 98.2
Interior Design 87.4
Indoors 87.4
Room 86.2
Living Room 85.3
People 84.8
Person 84.3
Chair 81
Apparel 76
Clothing 76
Person 73
Shoe 70.3
Footwear 70.3
Dining Room 68.7
Table 65
Face 64.3
Meal 61.3
Food 61.3
Photo 60.8
Photography 60.8
Text 58.8
Chair 57.4
Dining Table 56.5
Advertisement 55.1
Shoe 52.7

Imagga
created on 2022-01-08

man 33.6
blackboard 28.1
people 27.3
male 23.4
person 21.7
room 20.4
businessman 17.6
classroom 17.4
education 15.6
men 15.4
business 15.2
table 14.5
adult 14.3
meeting 14.1
sitting 13.7
hand 12.9
student 12.9
human 12.7
team 12.5
hall 12
smiling 11.6
restaurant 11.4
teacher 11.2
couple 10.4
technology 10.4
office 10.1
indoor 10
happy 10
interior 9.7
group 9.7
work 9.6
love 9.5
two 9.3
teamwork 9.3
modern 9.1
professional 9.1
portrait 9
worker 8.8
antique 8.8
chair 8.8
class 8.7
lifestyle 8.7
vintage 8.6
college 8.5
black 8.4
case 8.4
old 8.3
school 8.2
retro 8.2
cheerful 8.1
religion 8.1
science 8
looking 8
working 7.9
businesspeople 7.6
coat 7.3
glass 7.3
board 7.2
smile 7.1
women 7.1
family 7.1
medical 7.1
happiness 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.7
furniture 93.1
fireplace 92
table 87.3
clothing 78.7
person 75.6
chair 74.2
old 65.4
white 62.5

Face analysis
AWS Rekognition

Age 39-47
Gender Male, 98.4%
Calm 68.9%
Sad 28.3%
Confused 1.7%
Angry 0.4%
Disgusted 0.3%
Happy 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 40-48
Gender Female, 56.9%
Calm 88.6%
Sad 3.4%
Angry 3.4%
Happy 3.1%
Confused 0.6%
Surprised 0.4%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Female, 92.2%
Calm 61.1%
Sad 31.8%
Fear 2.2%
Confused 1.5%
Angry 1.2%
Happy 0.9%
Disgusted 0.7%
Surprised 0.6%

AWS Rekognition

Age 23-31
Gender Female, 68.1%
Sad 42.1%
Calm 29.2%
Confused 15.6%
Happy 6%
Angry 2.9%
Surprised 1.7%
Disgusted 1.3%
Fear 1.2%

AWS Rekognition

Age 47-53
Gender Female, 71.6%
Calm 73.4%
Sad 16.8%
Fear 2.1%
Happy 2.1%
Surprised 1.8%
Confused 1.4%
Angry 1.3%
Disgusted 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 81%
Shoe 70.3%

Captions

Microsoft

an old photo of a group of people sitting at a table 76.5%
a group of people in an old photo of a person 68.9%
an old photo of a group of people sitting around a table 68.8%

Text analysis

Amazon

19064.B
ISA
HAPPY
YYSTAR

Google

4.B
1906 4.B
1906