Human Generated Data

Title

Untitled (woman taking bunches of asparagus from crates)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5114

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Clothing 95.8
Apparel 95.8
Female 92.8
Face 92.5
Woman 76.3
Building 68.6
Girl 68.6
Outdoors 67.9
People 67.2
Nature 67.1
Portrait 67
Photography 67
Photo 67
Urban 63.5
Shorts 57.5

Imagga
created on 2022-01-23

motor vehicle 38.6
person 28.3
golf equipment 28.2
people 24
wheeled vehicle 23.1
car 22.7
attractive 21.7
sitting 21.5
sports equipment 21.1
laptop 20.1
vehicle 20
adult 19.4
pretty 18.2
happy 17.5
smile 16.4
work 15.8
portrait 15.5
man 15.4
equipment 15.3
smiling 15.2
outdoors 14.3
relaxation 14.2
women 13.4
model t 13
lifestyle 13
cute 12.9
male 12.9
sexy 12.8
computer 12.8
business 12.7
negative 12.5
outside 12
technology 11.9
summer 11.6
outdoor 11.5
lady 11.4
notebook 11.2
one 11.2
model 10.9
job 10.6
cheerful 10.6
travel 10.6
happiness 10.2
casual 10.2
professional 10.1
transportation 9.9
film 9.8
vacation 9.8
newspaper 9.8
water 9.3
tourist 9.3
traveler 9.2
leisure 9.1
fashion 9
worker 9
working 8.8
child 8.8
driver 8.7
hair 8.7
brunette 8.7
boy 8.7
automobile 8.6
life 8.5
transport 8.2
businesswoman 8.2
posing 8
home 8
black 7.8
face 7.8
men 7.7
youth 7.7
auto 7.7
photographic paper 7.6
communication 7.6
world 7.5
joy 7.5
playing 7.3
looking 7.2
sofa 7.1
sky 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.3
person 99.1
outdoor 97.8
clothing 89.5
smile 82.1
black and white 76.4
human face 75
woman 71.8
posing 63

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 58.4%
Confused 76.3%
Calm 11.7%
Happy 4.3%
Sad 3.1%
Disgusted 3%
Surprised 1.1%
Angry 0.3%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a person posing for the camera 92.8%
a man and a woman posing for a photo 54.6%
a person standing posing for the camera 54.5%

Text analysis

Amazon

12388
IT
belief

Google

12388 12388. MAMICA
12388
MAMICA
12388.