Human Generated Data

Title

Untitled (Ben Stahl and family in living room)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7873

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Indoors 99.6
Room 99.6
Person 99.5
Human 99.5
Furniture 99.2
Chair 99.2
Person 98.6
Person 94.6
Chair 92.5
Chair 90.4
Chair 89.2
Person 87.3
Dressing Room 82.6
Restaurant 73.7
Dining Room 68.5
Living Room 66
Person 61.6
Waiting Room 59.8
People 58.9
Cafeteria 57.8
Sitting 57.3

Imagga
created on 2022-01-09

room 66.2
table 55.6
interior 54.8
chair 47.6
furniture 45.7
restaurant 32.5
home 29.1
modern 28.7
house 26.7
floor 26
decor 23.9
design 23.6
dining 22.8
glass 22.3
inside 22.1
wood 21.7
dinner 21.5
luxury 20.6
seat 20.5
window 19.8
indoors 19.3
contemporary 18.8
style 17.8
hall 17.6
architecture 17.2
office 16.7
classroom 16.7
indoor 16.4
kitchen 16.4
chairs 15.7
comfortable 15.3
decoration 14.5
hotel 14.3
drink 14.2
counter 14.1
light 14
tables 13.8
food 13.5
wall 13.1
building 13
cafeteria 12.8
business 12.8
lamp 12.4
lunch 12.3
empty 12.3
elegant 12
meal 11.8
nobody 11.7
residential 11.5
decorate 11.4
toilet 11.2
dining table 11.1
service 11.1
bar 11.1
eat 10.9
stylish 10.9
setting 10.6
banquet 10.5
living 10.4
tile 10.4
party 10.3
structure 10.3
stool 10.1
elegance 10.1
3d 10.1
drawer 9.9
plant 9.7
sofa 9.7
area 9.7
apartment 9.6
estate 9.5
people 9.5
relaxation 9.2
cabinets 8.9
catering 8.8
urban 8.7
desk 8.7
lifestyle 8.7
lights 8.3
salon 8.3
cook 8.2
residence 8.2
sink 8.1
bathroom 7.9
upscale 7.9
day 7.8
reception 7.8
real 7.6
meeting 7.5
place 7.5
coffee 7.4
event 7.4
door 7.4
wine 7.4
hospital 7.3
wooden 7
furnishing 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 66.7%
Calm 46%
Sad 44.4%
Surprised 3.5%
Confused 2.5%
Happy 1.1%
Angry 1%
Fear 0.9%
Disgusted 0.6%

AWS Rekognition

Age 26-36
Gender Female, 67.3%
Happy 92.2%
Calm 5.2%
Sad 1.3%
Surprised 0.3%
Confused 0.3%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 99.2%

Captions

Microsoft

a group of people sitting at a table in front of a window 48.2%
a group of people sitting at a table 48.1%
a person sitting at a table in front of a window 48%

Text analysis

Amazon

415
415 71.
71.
KODAK-A-EITW

Google

25h
--YT
3A2
--
AQo
25hヨ--YTヨ3A2--AQo