Human Generated Data

Title

Untitled (woman with two Great Danes, Pennsylvania)

Date

1939, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.309

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Dog 97.4
Pet 97.4
Canine 97.4
Mammal 97.4
Animal 97.4
Furniture 97.2
Chair 96.7
Person 96.7
Human 96.7
Room 93.8
Indoors 93.8
Chair 81.7
Living Room 77
Chair 76.5
Interior Design 68.6
Flooring 63.4
Bedroom 59
Couch 58.5
Table 58.1
Monitor 58.1
Electronics 58.1
Display 58.1
Screen 58.1
Desk 57.6
Girl 55.5
Female 55.5

Clarifai
created on 2023-10-25

dog 100
canine 99.9
people 99.8
two 98.1
group together 97.5
furniture 97.4
boxer 96.4
group 96.1
man 94.3
adult 94.3
monochrome 93.1
three 93
one 92.6
mammal 91.5
owner 90.3
four 89.3
pet 86.9
chair 85.9
room 83.9
portrait 83.4

Imagga
created on 2022-01-08

room 70.6
chair 56.2
interior 55.7
table 48.1
restaurant 43.2
furniture 43.2
classroom 34.1
cafeteria 30
kitchen 29.6
floor 28.8
modern 28.7
seat 27.4
house 26.7
design 25.3
dining 24.7
wood 24.2
indoors 23.7
empty 23.2
home 23.1
inside 23
decor 23
chairs 22.5
building 21.6
contemporary 18.8
indoor 18.3
tables 17.7
structure 17.6
glass 17.1
style 17.1
dinner 15.3
comfortable 15.3
window 14.7
luxury 14.6
architecture 14.1
drink 13.4
food 12.9
stool 12.8
wall 12.8
light 12.7
elegance 12.6
hall 12.6
apartment 12.4
nobody 12.4
decoration 12.3
stove 12.1
office 11.9
folding chair 11.6
residential 11.5
plant 11.2
relaxation 10.9
counter 10.9
stylish 10.9
lifestyle 10.8
wooden 10.5
urban 10.5
lamp 10.5
lunch 10.3
place 10.2
bar 10.2
cook 10.1
eat 10.1
refrigerator 9.9
area 9.8
oven 9.8
hotel 9.5
life 9.3
event 9.2
desk 9
cabinets 8.9
teacher 8.8
setting 8.7
scene 8.7
decorate 8.6
3d 8.5
coffee 8.3
service 8.3
meal 8.2
drawer 7.9
device 7.9
cabinet 7.9
work 7.8
diner 7.8
tile 7.8
party 7.7
living 7.6
learning 7.5
row 7.4
school 7.4
business 7.3
people 7.2
domestic 7.2
working 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.7
dog 92.9
man 91.9
indoor 88.4
animal 88.3
furniture 85.4
chair 80.6
table 74.3
person 66
clothing 56.8
dining table 6.4

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 100%
Calm 95%
Happy 1.6%
Angry 1.3%
Sad 0.7%
Surprised 0.6%
Confused 0.4%
Disgusted 0.3%
Fear 0.3%

Microsoft Cognitive Services

Age 48
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Dog 97.4%
Chair 96.7%
Person 96.7%

Captions