Human Generated Data

Title

Untitled (woman and three children around Christmas tree)

Date

c. 1943

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7365

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Chair 99.9
Furniture 99.9
Human 98.6
Person 98.6
Person 95.1
Interior Design 94.6
Indoors 94.6
Room 84.8
Person 81.1
Chair 79.9
Person 79.4
Living Room 77.4
Shop 71.7
People 65.5
Person 61.1
Person 60.9
Lamp 60.1
Clinic 56.5
Person 54

Imagga
created on 2022-01-08

dishwasher 77.7
white goods 64.4
home appliance 50.2
appliance 39.5
interior 35.3
modern 30.1
house 26.7
home 25.5
architecture 24.2
furniture 23.5
design 22.5
room 21.6
kitchen 19.4
table 19.2
glass 19.2
luxury 18.8
indoors 17.6
apartment 17.2
3d 17
decor 16.8
durables 16.5
window 15.4
wall 15.4
business 15.2
floor 14.9
light 14.7
equipment 14
residential 13.4
building 13.1
inside 12.9
construction 12.8
clean 12.5
sketch 11.8
steel 11.5
shop 11.3
structure 10.9
decoration 10.8
domestic 10.8
refrigerator 10.6
drawing 10.6
chair 10.6
urban 10.5
plan 10.4
technology 10.4
contemporary 10.3
elegance 10.1
work 10.1
oven 9.9
science 9.8
metal 9.6
development 9.5
sink 9.1
new 8.9
office 8.8
medical 8.8
lifestyle 8.7
empty 8.6
lamp 8.6
research 8.6
industry 8.5
wood 8.3
city 8.3
indoor 8.2
style 8.2
case 8.1
water 8
working 7.9
people 7.8
render 7.8
stainless 7.7
architect 7.7
laboratory 7.7
project 7.7
comfortable 7.6
engineering 7.6
dining 7.6
estate 7.6
biology 7.6
lifestyles 7.6
food 7.4
shopping 7.3
machine 7.2
cabinet 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.1
house 75.3

Feature analysis

Amazon

Chair 99.9%
Person 98.6%
Lamp 60.1%

Captions

Microsoft

a person standing in front of a window 36.9%
a person standing next to a window 28.1%
an old photo of a person 28%

Text analysis

Amazon

ISI
19073
ELO61

Google

013. MAGON-YT3RA2-NAMT2A
013.
MAGON-YT3RA2-NAMT2A