Human Generated Data

Title

Untitled (family and dog around Christmas tree)

Date

1958

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7179

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Chair 99.3
Furniture 99.3
Person 99.1
Person 98.4
Person 93.6
Sitting 85.1
Indoors 80.5
Room 80.4
Restaurant 77
Table 73.9
People 66.5
Clinic 65.5
Meal 65.4
Food 65.4
Cafeteria 63.7
Person 61.6
Face 60.8
Photography 60.3
Photo 60.3
Cafe 58.6
Dining Table 58.5
Female 55.8

Clarifai
created on 2023-10-25

people 99.5
man 97.7
monochrome 97.6
chair 97.1
furniture 96.8
group 96.5
woman 96.4
indoors 95.2
adult 94.6
sit 94.5
room 94.5
family 92.6
group together 91.5
dining room 90.2
child 86.9
table 85.4
home 85.2
music 83.5
recreation 83
several 78.3

Imagga
created on 2022-01-08

interior 25.6
glass 21.9
table 21.3
room 20.4
chair 17
modern 16.8
salon 16.2
sketch 15.9
indoors 15.8
people 15.6
person 14.9
shop 14.9
restaurant 14.7
drawing 13.5
architecture 12.5
furniture 12.4
light 12
luxury 12
women 11.9
house 11.7
barbershop 11.5
man 11.4
adult 11.3
home 11.2
work 11
hall 10.9
equipment 10.7
design 10.7
medical 10.6
dinner 10.6
urban 10.5
party 10.3
men 10.3
floor 10.2
inside 10.1
science 9.8
business 9.7
group 9.7
dining 9.5
decoration 9.4
lifestyle 9.4
wine 9.4
drink 9.2
mercantile establishment 9.2
indoor 9.1
office 9.1
representation 8.9
style 8.9
working 8.8
black 8.4
window 8.4
instrument 8.4
wood 8.3
glasses 8.3
wedding 8.3
kitchen 8.2
food 8
building 8
decor 7.9
male 7.8
lab 7.8
chemistry 7.7
laboratory 7.7
elegant 7.7
residential 7.7
set 7.6
fashion 7.5
alcohol 7.5
human 7.5
hospital 7.3
wineglass 7.2
celebration 7.2
portrait 7.1
day 7.1

Google
created on 2022-01-08

Black-and-white 85.8
Style 83.9
Chair 80
Art 76.3
Monochrome 73.9
Monochrome photography 73.8
Room 70.4
Event 70.3
Table 67.1
Picture frame 66.3
Font 64.7
Sitting 60.5
Music 57.2
Visual arts 56.8
Illustration 56.7
Still life photography 55.3
Desk 54.6
Musician 52.6
Curtain 50.8

Microsoft
created on 2022-01-08

text 99.5
table 92.2
person 76.3
furniture 72.5
clothing 66.3
man 63
drawing 58.4
old 50.8

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 76.8%
Calm 87.8%
Happy 6%
Sad 5.2%
Confused 0.5%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Male, 98.7%
Happy 70.1%
Calm 15.5%
Sad 6.3%
Confused 2.6%
Disgusted 1.9%
Surprised 1.4%
Angry 1.1%
Fear 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Chair 99.3%

Categories

Imagga

interior objects 72.1%
paintings art 25.5%
food drinks 1.4%

Text analysis

Amazon

13757.
M3--YT3

Google

43757.
43757.