Human Generated Data

Title

Untitled (dancer on stage)

Date

c. 1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20218

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 98.9
Chair 98.9
Apparel 98.2
Clothing 98.2
Human 96.5
Person 96.5
Shorts 85.8
Dress 85.1
Person 83.4
Female 80.7
Helmet 80.1
Suit 72.7
Coat 72.7
Overcoat 72.7
Building 68.5
Leisure Activities 67.1
Woman 66.7
Face 65.8
Photo 65.8
Portrait 65.8
Photography 65.8
Table 64.6
Floor 63.9
Indoors 63
Architecture 62.6
Path 60.8
Plant 60.4
Palace 58.9
Housing 58.9
House 58.9
Mansion 58.9
Fashion 58.3
Robe 58.3
Text 58.1
Wheel 56.8
Machine 56.8
Gown 56.4
Crowd 56.2
Dining Table 55.8
Urban 55.6
Dance Pose 55.4
Bridegroom 55.4
Wedding 55.4
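
These label/confidence pairs (scores out of 100) have the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags could be produced with boto3, assuming configured AWS credentials and a local copy of the image (the file name and region are hypothetical):

    import boto3

    # Rekognition client; the region is an assumption
    client = boto3.client("rekognition", region_name="us-east-1")

    # Load the photograph from disk (hypothetical file name)
    with open("dancer_on_stage.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns a list of {"Name": ..., "Confidence": ...} entries
    response = client.detect_labels(Image={"Bytes": image_bytes})
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")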

Imagga
created on 2022-03-05

stick 26.3
hockey stick 25.7
man 20.2
sports equipment 19.6
sport 19
equipment 17.9
people 17.3
adult 16.8
person 15.7
portrait 15.5
outside 15.4
black 15
device 14.8
lifestyle 14.4
athlete 14.4
male 14.3
fitness 13.5
outdoors 13.4
exercise bike 12.8
body 12.8
play 12.1
fun 12
ball 11.9
attractive 11.9
exercise 11.8
city 11.6
wheeled vehicle 11.5
youth 11.1
exercise device 11
competition 11
outdoor 10.7
happy 10
leisure 10
recreation 9.9
building 9.6
sexy 9.6
urban 9.6
pretty 9.1
fashion 9
player 8.9
day 8.6
sitting 8.6
dark 8.3
training 8.3
active 8.3
one 8.2
playing 8.2
dress 8.1
sunset 8.1
women 7.9
park 7.8
life 7.8
shopping cart 7.8
travel 7.7
summer 7.7
basketball 7.7
sky 7.6
health 7.6
power 7.6
fit 7.4
danger 7.3
sun 7.2
game 7.1
posing 7.1
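
Imagga's tags come from its hosted REST API rather than an AWS SDK. A minimal sketch using the v2 tagging endpoint, assuming hypothetical API credentials and a publicly reachable copy of the image:

    import requests

    # Hypothetical Imagga credentials (HTTP basic auth) and image URL
    auth = ("<api_key>", "<api_secret>")
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/dancer_on_stage.jpg"},
        auth=auth,
    )
    resp.raise_for_status()

    # Each entry is {"confidence": float, "tag": {"en": str}}
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")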

Google
created on 2022-03-05
(no tags returned)

Microsoft
created on 2022-03-05

black and white 95.2
street 92.2
text 85.6
person 78.7
clothing 70.1
cartoon 63.5

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 99.2%
Calm 94.1%
Surprised 1.6%
Disgusted 1%
Fear 1%
Angry 0.8%
Sad 0.5%
Confused 0.5%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
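
The age range, gender, and emotion percentages above have the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested, while Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) instead of scores. A minimal sketch of both calls, under the same assumptions as the tagging example (credentials configured, hypothetical file name):

    import boto3
    from google.cloud import vision

    with open("dancer_on_stage.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    # --- Amazon Rekognition: age range, gender, emotions ---
    rek = boto3.client("rekognition", region_name="us-east-1")
    response = rek.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Each emotion carries its own confidence; print highest first
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

    # --- Google Vision: likelihood buckets per face ---
    gcv = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set
    result = gcv.face_detection(image=vision.Image(content=image_bytes))
    for face in result.face_annotations:
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)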

Feature analysis

Amazon

Person 96.5%
Helmet 80.1%
Wheel 56.8%

Captions

Microsoft

a person sitting on a bench 69.8%
a person standing in front of a building 69.7%
a person riding on the back of a bench 48.9%
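
Candidate captions in this style, each with a confidence, are what Azure Computer Vision's describe operation returns. A minimal sketch, assuming an Azure resource whose endpoint and key are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hypothetical endpoint and key for an Azure Computer Vision resource
    client = ComputerVisionClient(
        "https://<resource-name>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )

    with open("dancer_on_stage.jpg", "rb") as f:  # hypothetical file name
        analysis = client.describe_image_in_stream(f, max_candidates=3)

    # Confidences arrive in the 0-1 range; scale to match the listing above
    for caption in analysis.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")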

Text analysis

Amazon

٢ад
YТ3А°-AОX

Google

YT37A°2-XAON
YT37A°2-XAON
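
The detected strings are reproduced verbatim; the mixed scripts suggest the OCR engines read edge markings on the negative rather than scene text. A minimal sketch of Rekognition's text detection, under the same assumptions as the earlier examples:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")
    with open("dancer_on_stage.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    # DetectText returns LINE and WORD detections; print each detected line
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])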