Human Generated Data

Title

Untitled (woman jumping rope)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8254

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.5
Human 99.5
Apparel 99.5
Clothing 99.5
Shorts 99.1
Hat 68.3
Sport 67.5
Sports 67.5
Racket 61.5
Tennis Racket 61.5
Tennis 59.7
Sleeve 56.7
Tennis Court 56.2
Sphere 55.9

Imagga
created on 2022-02-05

skateboard 63.3
wheeled vehicle 52.3
board 48.3
vehicle 36.8
beach 28.2
conveyance 25.9
sport 25.3
people 25.1
man 24.2
outdoors 24.1
person 24
lifestyle 22.4
sky 22.3
adult 22.3
summer 21.9
outdoor 20.7
leisure 19.9
active 19.8
sea 19.6
male 18.4
exercise 18.2
fun 18
happy 16.9
ocean 16.8
happiness 16.5
athlete 16.4
sunset 16.2
sand 15.9
freedom 15.6
fitness 15.4
joy 15
action 14.8
sun 14.5
water 14
sexy 13.7
jumping 13.5
one 13.4
attractive 13.3
portrait 12.9
healthy 12.6
activity 12.5
running 11.5
enjoy 11.3
pretty 11.2
outside 11.1
skate 11.1
day 11
teenager 10.9
playing 10.9
model 10.9
recreation 10.8
dancer 10.7
vacation 10.6
fashion 10.6
lady 10.6
health 10.4
sunny 10.3
youth 10.2
dress 9.9
human 9.8
jump 9.6
body 9.6
men 9.4
clouds 9.3
clothing 9.2
silhouette 9.1
boy 8.7
smile 8.6
performer 8.4
relaxation 8.4
hand 8.4
alone 8.2
travel 7.8
cloud 7.8
life 7.7
motion 7.7
relax 7.6
energy 7.6
vitality 7.5
movement 7.5
shore 7.4
relaxing 7.3
coast 7.2
businessman 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 98.9
black and white 86.9
black 84.3
player 78.7
white 76.1
old 65.5
vintage 26.9

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 56.4%
Calm 82.1%
Happy 11.6%
Sad 3.6%
Angry 0.8%
Disgusted 0.7%
Surprised 0.7%
Fear 0.3%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a baseball player holding a racket 58.7%
a vintage photo of a baseball player holding a bat 50%
a vintage photo of a person 49.9%

Text analysis

Amazon

7673
A70A
MJ17
MJ17 ЭТАЯТIИ A70A
ЭТАЯТIИ

Google

AFƏA
7673 7673 MI3 3TARTIM AFƏA
MI3
7673
3TARTIM