Human Generated Data

Title

Untitled (young men using weight-lifting apparatus)

Date

c. 1960

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10774

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.8
Human 99.8
Person 99.8
Person 99.8
Person 99.2
Person 98.2
Acrobatic 96.5
Person 81.4
Person 81.3
Leisure Activities 70.9
Athlete 58.8
Gymnast 58.8
Gymnastics 58.8
Sport 58.8
Sports 58.8
Working Out 57.3
Exercise 57.3
Person 56.8

Imagga
created on 2022-01-15

stage 75.3
platform 57
person 25.2
people 24
male 20.6
man 20.1
equipment 17.9
body 16
weight 15.2
active 15.1
group 14.5
modern 14
sport 13.8
adult 13.2
health 13.2
room 12.7
interior 12.4
business 12.1
black 12
technology 11.9
exercise 11.8
sports equipment 11.8
lifestyle 11.6
gym 11.5
club 11.3
men 11.2
fitness 10.8
human 10.5
work 10.2
fit 10.1
gymnasium 10
music 10
concert 9.7
sexy 9.6
performer 9.5
player 9.4
training 9.2
city 9.1
indoor 9.1
hand 9.1
portrait 9.1
dancer 9
hall 8.9
teacher 8.9
businessman 8.8
education 8.7
table 8.6
glass 8.6
athlete 8.5
holding 8.3
professional 8.2
healthy 8.2
working 7.9
medical 7.9
indoors 7.9
classroom 7.9
standing 7.8
model 7.8
blackboard 7.7
train 7.7
crowd 7.7
dumbbell 7.6
happy 7.5
musician 7.5
inside 7.4
light 7.3
ball 7.3
bass 7.3
student 7.2
success 7.2
school 7.2
women 7.1

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 51.9%
Calm 79.2%
Disgusted 4.6%
Angry 3.6%
Sad 3.4%
Happy 3.4%
Surprised 2.7%
Confused 2.4%
Fear 0.7%

AWS Rekognition

Age 21-29
Gender Male, 74.4%
Calm 99.9%
Sad 0.1%
Happy 0%
Confused 0%
Disgusted 0%
Angry 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 93.6%
Calm 87%
Sad 6.5%
Happy 2.7%
Confused 2%
Angry 0.6%
Surprised 0.6%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 23-33
Gender Male, 99.4%
Surprised 57.1%
Calm 19.7%
Disgusted 5.3%
Happy 5%
Sad 4.4%
Confused 3.3%
Angry 2.9%
Fear 2.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing in a room 87.4%
a group of people playing instruments and performing on a stage 61.7%
a group of people in a room 61.6%

Text analysis

Amazon

56800.
VT27082
DIAGOY

Google

56800