Human Generated Data

Title

Untitled (Commencement, Mather House, Harvard University)

Date

June 1972

People

Artist: Elsa Dorfman, American, 1937–2020

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the artist, P2000.68

Copyright

© Elsa Dorfman

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.2
Person 99.1
Person 99
Person 99
Person 98.9
Restaurant 98.2
Furniture 96.6
Person 94.9
Dining Table 94.6
Table 94.6
Chair 94.1
Person 92.9
Person 92.3
Person 91
Room 88.8
Indoors 88.8
Meal 85.3
Food 85.3
Food Court 82
Cafeteria 78.4
Sitting 76.7
People 72
Couch 65.9
Dish 65.6
Chair 64.4
Person 64.1
Suit 59.3
Clothing 59.3
Coat 59.3
Apparel 59.3
Overcoat 59.3
Dining Room 59
Undershirt 58.6
Cafe 56.3
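
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels operation. A minimal sketch of how such tags could be generated, assuming boto3 with configured AWS credentials; the file name is a hypothetical placeholder for a scan of the photograph.

```python
# Minimal sketch: label/confidence tags via AWS Rekognition DetectLabels.
# Assumes boto3 is configured with AWS credentials; "mather_house_1972.jpg"
# is a hypothetical placeholder for a scan of the photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("mather_house_1972.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # cap on returned labels
    MinConfidence=55,   # drop low-confidence labels, as in the list above
)

# Print "Label Confidence" pairs, mirroring the tag list format above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```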

Clarifai
created on 2023-10-25

people 100
adult 99.3
group 99
group together 98.9
many 98
furniture 98
man 97
woman 96.5
administration 95.2
chair 94.5
several 94.1
leader 93.7
war 89.9
military 89.8
recreation 89.2
wear 89
vehicle 87.3
child 85.1
five 84.7
sit 80.5

Imagga
created on 2022-01-09

man 34.3
room 31.1
people 30.6
table 29.7
male 27.6
office 27.6
person 25.6
indoors 20.2
meeting 19.8
interior 19.4
business 19.4
desk 19.3
businessman 18.5
restaurant 17.9
working 17.7
home 17.5
happy 17.5
chair 17.5
sitting 17.2
indoor 16.4
men 16.3
computer 16.1
smiling 15.9
work 15.9
classroom 15.5
businesswoman 15.4
corporate 14.6
adult 14.6
group 14.5
team 14.3
senior 14
together 14
laptop 13.9
worker 13.5
smile 13.5
job 13.3
professional 13.2
executive 12.9
couple 12.2
teamwork 12
old 11.8
lifestyle 11.6
talking 11.4
communication 10.9
house 10.9
modern 10.5
engineer 10.5
businesspeople 10.4
coffee 10.2
suit 9.9
success 9.6
service 9.2
inside 9.2
drink 9.2
family 8.9
looking 8.8
teacher 8.8
women 8.7
education 8.6
building 8.6
architecture 8.6
glass 8.6
student 8.5
contemporary 8.5
learning 8.4
mature 8.4
occupation 8.2
patient 8.2
food 8.1
conference 7.8
portrait 7.8
party 7.7
class 7.7
hotel 7.6
workplace 7.6
furniture 7.6
hand 7.6
presentation 7.4
design 7.3
decor 7.1
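
The Imagga tags could be produced with Imagga's image-tagging service. A hedged sketch, assuming the v2 REST /tags endpoint with HTTP Basic authentication; the API key, secret, and image URL are placeholders.

```python
# Sketch only: tagging an image with the Imagga REST API (assumed endpoint
# https://api.imagga.com/v2/tags). Key, secret, and image URL are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/mather_house_1972.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each entry carries a confidence score and a localized tag name.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```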

Google
created on 2022-01-09

Table 94.7
Chair 84.7
Suit 77.7
Snapshot 74.3
Event 71.7
Recreation 69.7
Monochrome photography 69.4
Monochrome 68.9
Umbrella 68.3
Room 66.1
Vintage clothing 66
Tablecloth 65.3
History 63.3
Sitting 58.8
Art 57.9
Classic 57.6
Meal 54
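
The Google tags correspond to label detection in the Google Cloud Vision API. A minimal sketch, assuming the google-cloud-vision Python client with configured credentials; the file name is a placeholder.

```python
# Minimal sketch: label detection with the Google Cloud Vision API.
# Assumes google-cloud-vision is installed and credentials are configured;
# the file name is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("mather_house_1972.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Vision returns scores in [0, 1]; scale to percentages to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```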

Microsoft
created on 2022-01-09

table 99.2
person 98.6
text 97.7
clothing 90.8
furniture 90
people 84.4
man 83.6
chair 82.2
group 74.7
restaurant 22.7
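
The Microsoft tags resemble the output of the Azure Computer Vision Tag operation. A hedged sketch over the REST interface; the endpoint, API version (v3.2 assumed), and subscription key are placeholders.

```python
# Sketch only: image tagging via the Azure Computer Vision REST API.
# The endpoint, API version (v3.2 assumed), and key are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

with open("mather_house_1972.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Each tag carries a confidence in [0, 1]; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```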

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Calm 79.2%
Sad 14.8%
Disgusted 3%
Confused 1.3%
Angry 0.9%
Surprised 0.3%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 51-59
Gender Male, 97.5%
Calm 98.9%
Angry 0.4%
Confused 0.3%
Sad 0.1%
Disgusted 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 52-60
Gender Female, 97.4%
Happy 92.3%
Disgusted 5.3%
Sad 0.6%
Calm 0.6%
Surprised 0.4%
Angry 0.3%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 54-62
Gender Male, 71.1%
Calm 65.9%
Happy 13.5%
Angry 7.7%
Surprised 4.9%
Sad 2.8%
Confused 2.5%
Disgusted 1.4%
Fear 1.2%

AWS Rekognition

Age 9-17
Gender Female, 89.4%
Calm 94.9%
Sad 4%
Angry 0.4%
Surprised 0.3%
Confused 0.2%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 85%
Calm 80.9%
Sad 14.8%
Surprised 1%
Disgusted 0.9%
Confused 0.9%
Angry 0.8%
Fear 0.4%
Happy 0.3%

AWS Rekognition

Age 9-17
Gender Male, 64.6%
Disgusted 29.6%
Sad 25.4%
Calm 22.1%
Surprised 7.3%
Angry 7.3%
Happy 6.5%
Fear 1.2%
Confused 0.8%

AWS Rekognition

Age 23-31
Gender Male, 68.3%
Calm 98.8%
Surprised 0.3%
Angry 0.3%
Fear 0.2%
Sad 0.2%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
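
The age, gender, and emotion estimates above are the shape of output produced by AWS Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch, assuming boto3; the file name is a placeholder.

```python
# Minimal sketch: per-face age range, gender, and emotion estimates with
# AWS Rekognition DetectFaces. Assumes boto3 credentials; file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("mather_house_1972.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions come with confidences; sort so the dominant emotion prints first.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```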

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 68
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
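
The Google Vision entries report per-face likelihood ratings rather than percentages. A minimal sketch, assuming the google-cloud-vision Python client; the file name is a placeholder.

```python
# Minimal sketch: per-face likelihood ratings (joy, sorrow, anger, surprise,
# headwear, blur) with the Google Cloud Vision API. File name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("mather_house_1972.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood):
    # Convert enum names such as VERY_UNLIKELY to "Very unlikely".
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```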

Feature analysis

Amazon

Person 99.4%
Chair 94.1%
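
The feature analysis percentages correspond to labels for which Rekognition also returns instance-level bounding boxes (e.g. Person, Chair), as opposed to scene-level tags. A minimal sketch, again assuming boto3 and a placeholder file name.

```python
# Minimal sketch: object detections with bounding boxes ("feature analysis")
# from AWS Rekognition DetectLabels. Assumes boto3 credentials; the file name
# is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("mather_house_1972.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Only some labels (e.g. Person, Chair) carry instance-level bounding boxes.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # relative Left/Top/Width/Height in [0, 1]
        print(f"{label['Name']} {instance['Confidence']:.1f}%", box)
```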