Human Generated Data

Title

Untitled (woman seated in lawn chair next to pool)

Date

1952

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10698

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Furniture 100
Chair 92.3
Person 89.7
Human 89.7
Clothing 87.3
Apparel 87.3
Potted Plant 86.8
Jar 86.8
Plant 86.8
Vase 86.8
Pottery 86.8
Shorts 86.4
Sitting 72.9
Outdoors 69.1
Tree 65.1
Table 64.9
Portrait 64.1
Face 64.1
Photography 64.1
Photo 64.1
Monitor 63.3
Screen 63.3
Electronics 63.3
Display 63.3
Female 59.8
Planter 58.7
LCD Screen 58.7
Swimwear 58.5
Building 58.5
Water 57
Urban 56.9
Flooring 56.4
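
The record doesn't document how these tags were produced. As a rough sketch only (not the museum's actual pipeline), comparable labels could be requested from Amazon Rekognition with boto3; the filename below is a placeholder and AWS credentials are assumed to be configured.

    # Rough sketch only: regenerate Rekognition-style labels for a local copy of the image.
    # The filename is a placeholder; AWS credentials are assumed to be configured.
    import boto3

    def detect_labels(path, min_confidence=50.0):
        client = boto3.client("rekognition")
        with open(path, "rb") as f:
            response = client.detect_labels(
                Image={"Bytes": f.read()},
                MaxLabels=50,
                MinConfidence=min_confidence,
            )
        # Each label pairs a name with a 0-100 confidence score, matching the
        # "tag  score" pairs listed above.
        return [(label["Name"], round(label["Confidence"], 1))
                for label in response["Labels"]]

    for name, score in detect_labels("steinmetz_10698.jpg"):
        print(name, score)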

Imagga
created on 2022-01-15

piano 100
grand piano 100
stringed instrument 76.5
keyboard instrument 75.2
percussion instrument 74.1
musical instrument 52.8
chair 42.9
seat 23.3
business 16.4
luxury 13.7
computer 13.6
people 13.4
modern 13.3
architecture 13.3
technology 12.6
furniture 12.6
interior 12.4
laptop 11.8
person 11.2
work 11
relaxation 10.9
lifestyle 10.8
man 10.7
travel 10.6
sky 10.2
relax 10.1
holiday 10
room 10
building 9.9
vacation 9.8
indoors 9.7
office 9.6
black 9.6
outdoor 9.2
window 9.2
relaxing 9.1
summer 9
working 8.8
chairs 8.8
lounge 8.8
urban 8.7
scene 8.6
device 8.5
beach 8.4
indoor 8.2
outdoors 8.2
transportation 8.1
sun 8
rocking chair 8
water 8
home 8
day 7.8
happiness 7.8
male 7.8
table 7.4
support 7.3
color 7.2
medical 7.1
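
Likewise, the Imagga tags above match the shape of Imagga's public v2 tagging endpoint. A minimal sketch follows, assuming an API key/secret pair and a publicly reachable image URL (all placeholders, not values from this record).

    # Rough sketch only: request Imagga-style tags for an image URL.
    # Key, secret, and URL are placeholders.
    import requests

    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"

    def imagga_tags(image_url):
        resp = requests.get(
            "https://api.imagga.com/v2/tags",
            params={"image_url": image_url},
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            timeout=30,
        )
        resp.raise_for_status()
        # Each entry pairs an English tag name with a 0-100 confidence,
        # as in the list above.
        return [(t["tag"]["en"], round(t["confidence"], 1))
                for t in resp.json()["result"]["tags"]]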

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 100
black and white 73.5
black 69.3
aircraft 27.1

Face analysis

AWS Rekognition

Age 42-50
Gender Female, 98.9%
Happy 93.9%
Calm 4.5%
Angry 0.7%
Confused 0.3%
Surprised 0.3%
Sad 0.1%
Disgusted 0.1%
Fear 0.1%
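
The age range, gender, and emotion percentages above follow the shape of Rekognition's face detection output. A minimal sketch, again with a placeholder filename:

    # Rough sketch only: reproduce the age/gender/emotion fields with Rekognition
    # face detection. The filename is a placeholder.
    import boto3

    def face_analysis(path):
        client = boto3.client("rekognition")
        with open(path, "rb") as f:
            response = client.detect_faces(Image={"Bytes": f.read()},
                                           Attributes=["ALL"])
        for face in response["FaceDetails"]:
            age = face["AgeRange"]
            print(f"Age {age['Low']}-{age['High']}")
            print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
            # Emotions come back with 0-100 confidences; sort to match the listing above.
            for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
                print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

    face_analysis("steinmetz_10698.jpg")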

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
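
The Google Vision rows report likelihood buckets rather than percentages. A minimal sketch of how those buckets could be read back with the Cloud Vision client library (placeholder filename, application credentials assumed configured):

    # Rough sketch only: read the face likelihood buckets from Google Cloud Vision.
    from google.cloud import vision

    def vision_faces(path):
        client = vision.ImageAnnotatorClient()
        with open(path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.face_detection(image=image)
        for face in response.face_annotations:
            for label, value in [
                ("Surprise", face.surprise_likelihood),
                ("Anger", face.anger_likelihood),
                ("Sorrow", face.sorrow_likelihood),
                ("Joy", face.joy_likelihood),
                ("Headwear", face.headwear_likelihood),
                ("Blurred", face.blurred_likelihood),
            ]:
                # Enum names like VERY_UNLIKELY map to the "Very unlikely" wording above.
                print(label, vision.Likelihood(value).name.replace("_", " ").capitalize())

    vision_faces("steinmetz_10698.jpg")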

Feature analysis

Amazon

Chair 92.3%
Person 89.7%

Captions

Microsoft

a person sitting in front of a store 28.1%
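
The caption and its 28.1% score resemble the description output of Azure Computer Vision's analyze call, which also returns the Microsoft tags listed earlier. A minimal sketch over the REST API, with placeholder endpoint and key:

    # Rough sketch only: request an Azure Computer Vision caption and tags.
    # Endpoint, key, and filename are placeholders.
    import requests

    AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    AZURE_KEY = "your_subscription_key"

    def azure_caption(path):
        with open(path, "rb") as f:
            resp = requests.post(
                f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
                params={"visualFeatures": "Description,Tags"},
                headers={
                    "Ocp-Apim-Subscription-Key": AZURE_KEY,
                    "Content-Type": "application/octet-stream",
                },
                data=f.read(),
                timeout=30,
            )
        resp.raise_for_status()
        result = resp.json()
        # Captions and tags carry 0-1 confidences; scale to match the listing above.
        for caption in result["description"]["captions"]:
            print(caption["text"], f"{caption['confidence'] * 100:.1f}%")
        for tag in result["tags"]:
            print(tag["name"], f"{tag['confidence'] * 100:.1f}")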

Text analysis

Amazon

35325
all
KODAKSEIA

Google

33325 YT37A°2-XAO
33325
YT37A°2-XAO
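
Both text readings above look like machine transcriptions of the film's edge markings, kept here exactly as detected. A minimal sketch of how such strings could be pulled from the two services (placeholder filename):

    # Rough sketch only: detect text with Rekognition and Google Cloud Vision.
    # The filename is a placeholder; credentials for both services are assumed configured.
    import boto3
    from google.cloud import vision

    def rekognition_text(path):
        client = boto3.client("rekognition")
        with open(path, "rb") as f:
            response = client.detect_text(Image={"Bytes": f.read()})
        # Keep whole lines; Rekognition also returns word-level detections.
        return [d["DetectedText"] for d in response["TextDetections"]
                if d["Type"] == "LINE"]

    def google_text(path):
        client = vision.ImageAnnotatorClient()
        with open(path, "rb") as f:
            image = vision.Image(content=f.read())
        response = client.text_detection(image=image)
        # The first annotation is the full block; the rest are individual segments.
        return [t.description for t in response.text_annotations]

    print(rekognition_text("steinmetz_10698.jpg"))
    print(google_text("steinmetz_10698.jpg"))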