Human Generated Data

Title

Untitled (two women kicking their legs in front of trailer)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7386

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.1
Person 99.1
Person 98
Dance Pose 96.9
Leisure Activities 96.9
Dance 88.7
Sport 79.6
Sports 79.6
Person 76.7
Performer 69.9
Athlete 66
Acrobatic 60
Ballet 58.9
Ballerina 57.3
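
Each Amazon tag above pairs a label with a confidence score. As a minimal sketch, tags of this shape could be produced with AWS Rekognition's DetectLabels API; the `format_labels` helper and the mock response below are illustrative, not part of this record, and the live call assumes boto3 is installed with AWS credentials configured:

```python
def format_labels(response):
    """Render a Rekognition DetectLabels response as 'Name confidence' lines."""
    return [f"{label['Name']} {round(label['Confidence'], 1)}"
            for label in response.get("Labels", [])]

def detect_labels(image_bytes):
    # Live call (assumption): requires boto3 and configured AWS credentials.
    import boto3
    client = boto3.client("rekognition")
    return client.detect_labels(Image={"Bytes": image_bytes})

# Mock response shaped like Rekognition's real output, for illustration only:
mock = {"Labels": [{"Name": "Person", "Confidence": 99.13},
                   {"Name": "Dance Pose", "Confidence": 96.9}]}
print(format_labels(mock))  # ['Person 99.1', 'Dance Pose 96.9']
```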

Imagga
created on 2022-01-08

equipment 25.9
backboard 20.6
daily 20.1
ball 19.7
adult 19.4
newspaper 18.3
people 17.8
person 17.7
sport 17.1
basketball 16.4
fitness 16.3
man 14.8
athlete 14.7
game equipment 14.5
black 14.4
product 14.1
lifestyle 13.7
sexy 12.8
outdoors 12.7
beach 12.6
portrait 12.3
summer 12.2
male 12.1
body 12
attractive 11.9
basketball equipment 11.7
sports equipment 11.5
creation 11.3
pretty 11.2
exercise 10.9
outdoor 10.7
outside 10.3
sky 10.2
street 10.1
model 10.1
healthy 10.1
water 10
active 9.9
human 9.7
lady 9.7
action 9.3
clothing 9.3
danger 9.1
health 9
billboard 8.8
women 8.7
bikini 8.7
athletic 8.6
sea 8.6
wall 8.5
sand 8.3
fit 8.3
sign 8.3
vacation 8.2
sunset 8.1
coast 8.1
sun 8
posing 8
hair 7.9
face 7.8
swimsuit 7.8
travel 7.7
jumping 7.7
youth 7.7
concrete 7.7
skin 7.6
fashion 7.5
dark 7.5
fun 7.5
style 7.4
light 7.3
signboard 7.3
freedom 7.3
figure 7.2
dress 7.2
road 7.2
structure 7.1
day 7.1

Google
created on 2022-01-08

Shorts 79.1
Font 76.3
Monochrome 72.2
Monochrome photography 70.7
Recreation 70.3
Knee 68.5
Advertising 68.2
Balance 64.6
Stock photography 62.9
Billboard 61.7
Fun 60.9
Art 59.8
Photo caption 55
Sports 53.3
Human leg 52.1
Signage 51.3

Microsoft
created on 2022-01-08

text 99.9
outdoor 96.3
sign 90.6
dance 85.2
person 72.3
black and white 68.3
white 60.4
clothing 53.7

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 52.5%
Calm 98.9%
Surprised 0.5%
Happy 0.3%
Sad 0.1%
Disgusted 0.1%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Surprised 65.4%
Happy 21%
Calm 8.9%
Confused 2.2%
Sad 0.7%
Fear 0.7%
Disgusted 0.7%
Angry 0.4%
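
The AWS Rekognition face results above (an age range, a gendered guess with confidence, and emotions ranked by confidence) follow the shape of a DetectFaces FaceDetail entry. A hedged sketch of rendering one such entry in that layout; the `format_face` helper and the mock face are illustrative, and a live call would go through `boto3.client("rekognition").detect_faces(...)`:

```python
def format_face(face):
    """Render one Rekognition FaceDetail as age, gender, and ranked emotions."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {round(face['Gender']['Confidence'], 1)}%",
    ]
    # Emotions are listed from most to least confident.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emotion['Type'].capitalize()} {round(emotion['Confidence'], 1)}%")
    return lines

# Mock FaceDetail mirroring the structure of a real DetectFaces response:
mock_face = {
    "AgeRange": {"Low": 36, "High": 44},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [{"Type": "HAPPY", "Confidence": 21.0},
                 {"Type": "SURPRISED", "Confidence": 65.4}],
}
print(format_face(mock_face))
```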

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a person holding a sign posing for the camera 75.7%
a person holding a sign 75.6%
a person holding a sign posing for the camera 75.5%

Text analysis

Amazon

THE
ENDAS
RASOTA,
19214
LA
'AI
WT.6000.

Google

THE
RASOTA
THE AL ENDAS RASOTA LA WT0000. 19214.
AL
LA
WT0000.
ENDAS
19214.
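
The fragments above are raw OCR output from the sign in the photograph. As a sketch, results of this shape could come from Rekognition's DetectText API, which returns both full LINE detections and individual WORD detections; the `format_text` helper and mock response below are illustrative only:

```python
def format_text(response, kind="LINE"):
    """Collect detected strings of a given type ('LINE' or 'WORD')."""
    return [d["DetectedText"]
            for d in response.get("TextDetections", [])
            if d["Type"] == kind]

# Mock response mirroring DetectText's structure, for illustration:
mock = {"TextDetections": [
    {"DetectedText": "THE", "Type": "WORD"},
    {"DetectedText": "ENDAS", "Type": "WORD"},
    {"DetectedText": "THE ENDAS", "Type": "LINE"},
]}
print(format_text(mock))          # ['THE ENDAS']
print(format_text(mock, "WORD"))  # ['THE', 'ENDAS']
```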