Human Generated Data

Title

Untitled (children with birthday hats seated around table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4884

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Furniture 99.9
Chair 98.6
Person 98.6
Human 98.6
Person 95.7
Person 95.7
Chair 92.6
Person 89.5
Person 86.9
Apparel 86
Helmet 86
Clothing 86
Person 84.1
Chair 78.4
People 77.9
Table 77.7
Dining Table 75.9
Room 74.9
Indoors 74.9
Suit 71.5
Coat 71.5
Overcoat 71.5
Crowd 71
Person 68.6
Meal 62.6
Food 62.6
Photography 60.8
Photo 60.8
Person 60.4
Plant 60.4
Yard 59.9
Nature 59.9
Outdoors 59.9
Kid 55.6
Child 55.6
Person 44

Imagga
created on 2022-01-23

brass 55.9
wind instrument 42.1
chair 40.4
musical instrument 28
seat 21.7
furniture 17.1
work 16.9
rocking chair 15.6
wicker 15.1
man 14.8
people 12.3
outdoors 11.9
table 11.7
sky 11.5
baritone 11.5
room 11.1
house 10.8
sitting 10.3
outside 10.3
person 10.2
chairs 9.8
interior 9.7
business 9.7
classroom 9.5
building 9.4
day 9.4
product 9.3
relaxation 9.2
modern 9.1
folding chair 8.8
structure 8.5
travel 8.4
black 8.4
outdoor 8.4
summer 8.3
event 8.3
human 8.2
water 8
working 7.9
architecture 7.8
luxury 7.7
floor 7.4
design 7.3
relaxing 7.3
sun 7.2
lifestyle 7.2
holiday 7.2
male 7.1

Google
created on 2022-01-23

Table 94.9
Furniture 94.6
Chair 92.8
Black 89.5
Black-and-white 85.8
Style 83.8
Suit 79.8
Adaptation 79.3
Monochrome 74.3
Monochrome photography 72.5
Room 69.8
Event 65.8
History 64.4
Classic 61.5
Sitting 60.5
Dining room 57.5
Building 56
Art 53.7
Recreation 52.5
Illustration 51.8

Microsoft
created on 2022-01-23

text 97
chair 93.3
table 91.9
house 70.3
black and white 68.6
furniture 19.1

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 86.4%
Surprised 90.5%
Happy 5.5%
Fear 2.6%
Calm 0.7%
Disgusted 0.2%
Confused 0.2%
Sad 0.2%
Angry 0.1%

AWS Rekognition

Age 29-39
Gender Male, 72.4%
Happy 91.7%
Sad 2.1%
Surprised 2%
Angry 1.2%
Calm 1%
Disgusted 0.9%
Confused 0.8%
Fear 0.3%

AWS Rekognition

Age 33-41
Gender Female, 93.1%
Sad 91.4%
Calm 2.6%
Surprised 2.3%
Happy 1.3%
Angry 0.7%
Disgusted 0.6%
Fear 0.6%
Confused 0.4%

AWS Rekognition

Age 33-41
Gender Female, 99.5%
Calm 92.3%
Happy 2.8%
Sad 2%
Surprised 1.5%
Fear 0.7%
Angry 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 21-29
Gender Female, 99.8%
Happy 71.7%
Confused 5.9%
Calm 5.8%
Fear 4.8%
Surprised 4.7%
Sad 3.3%
Angry 2.6%
Disgusted 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 98.6%
Person 98.6%
Helmet 86%

Captions

Microsoft

a group of people sitting in a chair 74.4%
a group of people sitting on a chair 70.9%
a group of people sitting at a table 65.2%

Text analysis

Amazon

18403
16403.

Google

1640
1640