Human Generated Data

Title

Untitled (men cutting cake, American Association of Nurserymen)

Date

c. 1945

People

Artist: Harris & Ewing, American, 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22294

Machine Generated Data

Tags (confidence out of 100)

Amazon
created on 2022-03-11

Meal 99.8
Food 99.8
Person 98
Human 98
Person 98
Person 97.5
Clothing 97.4
Apparel 97.4
Person 97.3
Person 97
Person 96.8
Restaurant 96.4
Person 95.6
Dish 93.2
Person 91.3
Person 90.5
Shirt 88.6
Home Decor 88.2
Cafeteria 87.5
Waiter 85.1
Buffet 84.3
Person 73.2
People 71
Linen 66.7
Culinary 65.6
Person 52.9
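
The Amazon tags above have the shape of output returned by AWS Rekognition's DetectLabels API, where each label carries a confidence score out of 100. A minimal sketch of how such tags could be produced with boto3; the file name is a placeholder, and AWS credentials must be configured separately:

    import boto3

    # A minimal sketch: request labels for a local image via AWS Rekognition.
    # The file name is a placeholder; credentials are configured outside this code.
    client = boto3.client("rekognition")

    with open("untitled_nurserymen_cake.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # roughly matches the lowest score listed above (Person 52.9)
    )

    # Each label pairs a name with a 0-100 confidence, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")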

Clarifai
created on 2023-10-22

people 99.8
group 99.2
adult 98.7
leader 98.6
group together 98.1
administration 97.3
man 96.9
many 95.7
military 94.2
several 91.4
woman 91.2
war 89.9
wear 87.9
five 84.8
furniture 84.2
sit 83.7
uniform 76.8
portrait 76.6
facial expression 74.7
four 72.7
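
Clarifai reports concept predictions with values between 0 and 1, rendered here on a 0-100 scale. A hedged sketch of what a request to Clarifai's v2 REST API could look like; the model ID, access token, and image URL are assumptions, not taken from this record:

    import requests

    # A sketch of a Clarifai v2 "outputs" request. The model ID, token, and
    # image URL are placeholders (assumptions); consult Clarifai's docs.
    PAT = "YOUR_PERSONAL_ACCESS_TOKEN"
    MODEL_ID = "general-image-recognition"  # assumed general-purpose model ID

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {PAT}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    )
    resp.raise_for_status()

    # Concepts come back with a name and a 0-1 value; scaled by 100 they
    # match the list above.
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")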

Imagga
created on 2022-03-11

patient 70.4
person 59.2
nurse 57.8
case 48.3
sick person 48.2
man 41.7
male 31.2
people 29
adult 25.2
hospital 21.4
coat 20.8
medical 20.3
lab coat 19.8
business 19.4
smiling 18.1
men 18
businessman 17.7
indoors 17.6
doctor 16.9
home 16.7
specialist 16.7
40s 16.6
clothing 16.5
30s 16.4
couple 15.7
happy 15.7
health 15.3
businesspeople 15.2
colleagues 14.6
talking 14.3
room 13.8
smile 13.5
day 13.3
together 13.1
sitting 12.9
20s 12.8
businesswoman 12.7
medicine 12.3
mature 12.1
office 12
casual 11.9
forties 11.8
portrait 11.6
team 11.6
professional 11.6
worker 11.6
family 11.6
adults 11.4
senior 11.2
clothes 11.2
color 11.1
women 11.1
uniform 10.9
face 10.7
table 10.4
meeting 10.4
work 10.2
surgeon 10
bright 10
four people 9.9
attractive 9.8
thirties 9.7
working 9.7
laboratory 9.6
two 9.3
care 9.1
business people 8.9
computer 8.8
discussion 8.8
lab 8.7
elderly 8.6
twenties 8.6
garment 8.5
camera 8.3
confident 8.2
cheerful 8.1
clinic 8
looking 8
to 8
lifestyle 7.9
discussing 7.9
happiness 7.8
emotions 7.8
corporate 7.7
daytime 7.7
four 7.7
exam 7.7
serious 7.6
hand 7.6
bed 7.6
desk 7.6
teamwork 7.4
occupation 7.3
group 7.3
black 7.2
handsome 7.1
father 7.1
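
Imagga exposes its tagger as a REST endpoint authenticated with an API key and secret. A sketch under that assumption; the credentials and image URL are placeholders:

    import requests

    # A sketch of Imagga's v2 tagging endpoint; credentials and the image
    # URL are placeholders (assumptions).
    API_KEY = "YOUR_API_KEY"
    API_SECRET = "YOUR_API_SECRET"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence, as listed above.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")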

Google
created on 2022-03-11

Shirt 94.1
Coat 86.6
Style 83.8
Black-and-white 82.2
Hat 81.6
Cooking 75.8
Monochrome photography 75.3
Event 73.3
Vintage clothing 72.9
Crew 72.7
Monochrome 72.1
Table 69.8
Team 65
History 64.1
Room 63.2
Uniform 62.9
Stock photography 62.8
Formal wear 60.6
Family 52.8
Photographic paper 50.6
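
The Google tags correspond to Cloud Vision label detection, which scores each label from 0 to 1. A minimal sketch using the google-cloud-vision client; the file path is a placeholder and application credentials must be configured separately:

    from google.cloud import vision

    # A minimal sketch of Cloud Vision label detection; the file path is a
    # placeholder and credentials are configured outside this code.
    client = vision.ImageAnnotatorClient()

    with open("untitled_nurserymen_cake.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1; multiplied by 100 they match the scale used above.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")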

Microsoft
created on 2022-03-11

text 95.2
clothing 94.9
person 93.1
man 92.8
funeral 73.4
table 72.7
food 69.6
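
The Microsoft tags have the shape of output from Azure's Computer Vision Tag operation. A sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

    import requests

    # A sketch of the Azure Computer Vision v3.2 Tag operation; the endpoint,
    # subscription key, and image URL are placeholders (assumptions).
    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    KEY = "YOUR_SUBSCRIPTION_KEY"

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.com/photo.jpg"},
    )
    resp.raise_for_status()

    # Tags carry a 0-1 confidence; scaled by 100 they match the list above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")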

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.8%
Calm 90.6%
Happy 3.2%
Sad 1.7%
Disgusted 1.6%
Angry 1.4%
Surprised 0.9%
Confused 0.4%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Female, 77.3%
Calm 99.8%
Confused 0.1%
Surprised 0%
Happy 0%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 77.1%
Sad 57.5%
Calm 33.8%
Happy 2%
Fear 1.9%
Confused 1.8%
Surprised 1.7%
Disgusted 0.8%
Angry 0.5%

AWS Rekognition

Age 43-51
Gender Male, 96.7%
Sad 65.9%
Calm 27.2%
Disgusted 1.8%
Angry 1.7%
Fear 1.6%
Surprised 1%
Confused 0.4%
Happy 0.4%

AWS Rekognition

Age 29-39
Gender Female, 96.5%
Happy 87.2%
Calm 8.1%
Surprised 1.8%
Disgusted 1.3%
Sad 0.7%
Fear 0.4%
Confused 0.3%
Angry 0.3%

AWS Rekognition

Age 38-46
Gender Female, 94.3%
Calm 70%
Happy 24.5%
Disgusted 1.6%
Sad 1.3%
Surprised 0.8%
Confused 0.8%
Fear 0.7%
Angry 0.4%
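
Each AWS Rekognition block above (an age range, a gender call, and an emotion distribution summing to roughly 100%) is the per-face structure returned by the DetectFaces API with full attributes. A minimal sketch; the file name is a placeholder:

    import boto3

    # A minimal sketch of per-face attributes from AWS Rekognition; the
    # file name is a placeholder.
    client = boto3.client("rekognition")

    with open("untitled_nurserymen_cake.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotion confidences roughly sum to 100%, as in the blocks above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")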

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
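
The Google Vision rows report per-face likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch of how those labels are read from Cloud Vision face detection results; the file path is a placeholder:

    from google.cloud import vision

    # A minimal sketch of Cloud Vision face detection; the file path is a
    # placeholder and credentials are configured outside this code.
    client = vision.ImageAnnotatorClient()

    with open("untitled_nurserymen_cake.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    def bucket(likelihood):
        # Convert the Likelihood enum to a readable label, e.g. "Very unlikely".
        return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

    for face in response.face_annotations:
        print("Surprise", bucket(face.surprise_likelihood))
        print("Anger", bucket(face.anger_likelihood))
        print("Sorrow", bucket(face.sorrow_likelihood))
        print("Joy", bucket(face.joy_likelihood))
        print("Headwear", bucket(face.headwear_likelihood))
        print("Blurred", bucket(face.blurred_likelihood))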

Feature analysis

Amazon

Person
Person 98%
Person 98%
Person 97.5%
Person 97.3%
Person 97%
Person 96.8%
Person 95.6%
Person 91.3%
Person 90.5%
Person 73.2%
Person 52.9%
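
The repeated Person rows reflect individual instance detections: for objects such as Person, DetectLabels returns an Instances list, each entry with its own bounding box and confidence. A sketch of how the per-person scores above could be extracted; the file name is a placeholder:

    import boto3

    # A sketch of per-instance Person detections from DetectLabels; the
    # file name is a placeholder.
    client = boto3.client("rekognition")

    with open("untitled_nurserymen_cake.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

    for label in response["Labels"]:
        if label["Name"] == "Person":
            # One entry per detected person, each with its own confidence.
            for instance in label["Instances"]:
                print(f"Person {instance['Confidence']:.1f}%")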

Categories

Imagga

people portraits 98.9%
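
Imagga's category ("people portraits") comes from a separate categorization endpoint that assigns the image to one of a fixed set of scene categories. A hedged sketch; the categorizer ID, credentials, and image URL are assumptions, not taken from this record:

    import requests

    # A sketch of Imagga's v2 categorization endpoint; the categorizer ID,
    # credentials, and image URL are placeholders (assumptions).
    API_KEY = "YOUR_API_KEY"
    API_SECRET = "YOUR_API_SECRET"
    CATEGORIZER = "personal_photos"  # assumed categorizer ID

    resp = requests.get(
        f"https://api.imagga.com/v2/categories/{CATEGORIZER}",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each category pairs a name with a 0-100 confidence, as above.
    for category in resp.json()["result"]["categories"]:
        print(f"{category['name']['en']} {category['confidence']:.1f}%")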