Human Generated Data

Title

Untitled (seated woman crocheting)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7754

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Chair 99.3
Furniture 99.3
Fish 98.5
Animal 98.5
Apparel 97.2
Clothing 97.2
Human 96.4
Person 96.4
Sitting 80.9
Face 80.3
Female 70.6
Photography 67.3
Portrait 67.3
Photo 67.3
Door 64.3
Dress 60.6
Wood 57.6
Woman 56.2

Imagga
created on 2022-01-09

person 39.3
grandma 35
home 34.3
people 30.7
senior 28.1
adult 27.7
computer 27.5
happy 25.1
laptop 24.4
man 24.2
male 23.5
smiling 23.2
newspaper 22.9
mature 22.3
indoors 21.1
lifestyle 20.2
working 19.5
elderly 19.2
sitting 18.9
business 16.4
retirement 16.3
casual 16.1
looking 16
product 15.8
office 15.6
old 15.3
work 15.2
room 15
smile 15
clothing 14.6
retired 14.5
child 14.3
worker 13.8
grandmother 13.7
portrait 13.6
happiness 13.3
notebook 12.8
technology 12.6
family 12.5
desk 12.4
creation 12.3
one 12
pretty 11.9
attractive 11.9
indoor 11.9
together 11.4
cheerful 11.4
face 11.4
couple 11.3
professional 11.1
women 11.1
alone 11
house 10.9
kid 10.6
interior 10.6
lady 10.6
mother 10.1
20s 10.1
aged 10
grandfather 9.9
human 9.8
job 9.7
standing 9.6
men 9.4
learning 9.4
suit 9.4
domestic 9.3
phone 9.2
occupation 9.2
executive 8.8
corporate 8.6
reading 8.6
keyboard 8.5
communication 8.4
student 8.4
book 8.4
inside 8.3
holding 8.3
teenager 8.2
children 8.2
cute 7.9
love 7.9
table 7.9
living room 7.8
education 7.8
older 7.8
couch 7.7
modern 7.7
youth 7.7
wireless 7.6
talking 7.6
garment 7.6
scholar 7.4
teen 7.4
device 7.3
businesswoman 7.3
color 7.2
kitchen 7.2
pensioner 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.6
person 98.1
human face 95.3
black and white 94.7
clothing 90.4
smile 76.1
white 70.5

Face analysis

AWS Rekognition

Age 54-62
Gender Female, 89.8%
Happy 59.3%
Confused 10.4%
Calm 9.2%
Surprised 7.2%
Sad 4.8%
Fear 3.6%
Disgusted 3.2%
Angry 2.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Fish 98.5%
Person 96.4%

Captions

Microsoft

an old photo of a person 78.9%
old photo of a person 74.8%
an old photo of a person 74.7%

Text analysis

Amazon

24546-A

Google

24546-A. 24546-A
24546-A
24546-A.