Human Generated Data

Title

Untitled (three women making flower bouquets)

Date

c. 1960

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10788

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Room 99.3
Indoors 99.3
Person 98.9
Human 98.9
Person 98.6
Bedroom 89.4
Living Room 89.3
Interior Design 89.3
Plant 87.1
Dressing Room 82.9
Vase 66
Jar 66
Pottery 66
People 65
Flower 62.1
Blossom 62.1
Art 61.9
Bed 61.6
Furniture 61.6
Clothing 60.9
Apparel 60.9
Dorm Room 60.8
Flower Arrangement 58.7
Potted Plant 55
Person 53.5

Imagga
created on 2022-01-15

newspaper 28.9
product 21.1
man 20.2
people 20.1
interior 19.5
home 19.1
room 17.6
window 16.6
creation 16.5
old 16
shop 14.4
male 14.2
house 13.4
person 13
architecture 12.5
vintage 12.4
indoors 12.3
indoor 11.9
adult 11.8
family 11.6
building 11.4
salon 11.3
ancient 11.2
historic 11
mercantile establishment 10.8
antique 10.7
table 10.7
retro 10.7
men 10.3
barbershop 10.2
inside 10.1
groom 10
work 9.9
glass 9.9
fashion 9.8
couple 9.6
love 9.5
chair 9.3
worker 9.1
portrait 9.1
dress 9
religion 9
history 8.9
mother 8.8
decoration 8.7
women 8.7
smiling 8.7
art 8.6
bouquet 8.6
wedding 8.3
happy 8.1
clinic 8
holiday 7.9
happiness 7.8
child 7.8
black 7.8
elegant 7.7
furniture 7.7
historical 7.5
frame 7.5
new 7.3
place of business 7.3
smile 7.1
medical 7.1
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 97.7
indoor 85.9
vase 67.1
old 40.9
family 21.3
cluttered 11

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 89.8%
Calm 96.6%
Disgusted 0.9%
Happy 0.9%
Sad 0.7%
Confused 0.4%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 50-58
Gender Male, 100%
Sad 98.8%
Happy 0.4%
Confused 0.2%
Angry 0.2%
Calm 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people sitting at a table 87.2%
a group of people sitting around a table 86.8%
a group of people sitting on a table 79.2%

Text analysis

Amazon

57065-A.
MJ17-YT33

Google

F70
65-A.
M7-YT3
F70 65-A. M7-YT3