Human Generated Data

Title

Untitled (Moore family at dinner table)

Date

c. 1940

People

Artist: C. Bennette Moore, American, 1879–1939

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21794

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99
Human 99
Person 98
Person 97.5
Person 96.8
Room 96.7
Indoors 96.7
Person 96.3
Person 95.2
Person 95
Dining Room 94.9
Person 94.6
Furniture 94.1
Person 93.4
Person 91.2
Restaurant 91
Chair 90.6
Meal 89.9
Food 89.9
People 88.1
Dining Table 81.1
Table 81.1
Person 79.8
Dish 78
Crowd 76.8
Dinner 71.5
Supper 71.5
Person 65.7
Portrait 64.2
Photography 64.2
Face 64.2
Photo 64.2
Text 61
Cafeteria 55.9
Workshop 55.7
Audience 55.4
Person 49.8
Person 44.9

Clarifai
created on 2023-10-22

people 100
group 99.6
many 98.8
adult 98.3
furniture 98.2
group together 98
administration 97.1
man 96.6
woman 96.2
sit 95.5
child 94.9
room 94.6
several 94
leader 94
chair 92.8
home 92.3
dining room 89.1
recreation 88.3
war 88.2
military 87.3

Imagga
created on 2022-03-11

sax 30
old 25.8
building 19.2
shop 19
room 18.8
barbershop 17.3
vintage 16.5
hall 15.5
newspaper 14.7
indoors 14
people 13.9
table 13.8
ancient 13.8
chair 13.8
home 13.5
architecture 13.3
interior 13.3
product 12.3
antique 12.1
mercantile establishment 12.1
man 12.1
indoor 11.9
aged 11.8
person 11.3
wind instrument 11.2
city 10.8
male 10.6
wall 10.3
grunge 10.2
inside 10.1
history 9.8
retro 9.8
couple 9.6
men 9.4
window 9.3
historic 9.2
musical instrument 9.1
creation 8.8
women 8.7
love 8.7
sitting 8.6
glass 8.6
house 8.4
place of business 8.1
classroom 8.1
decoration 8
business 7.9
structure 7.9
art 7.8
restaurant 7.7
historical 7.5
brown 7.4
design 7.3
bowed stringed instrument 7.3
celebration 7.2
night 7.1
travel 7
modern 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

table 96.4
text 96
person 91.9
clothing 90.6
man 81.8
furniture 75.2
wedding 73.1
chair 72.4
woman 57.9
old 43.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Female, 66.8%
Calm 99.9%
Happy 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 71.1%
Happy 32.8%
Sad 30.9%
Calm 14.2%
Confused 8.1%
Disgusted 4.8%
Surprised 4.4%
Fear 3.2%
Angry 1.5%

AWS Rekognition

Age 52-60
Gender Female, 89.8%
Sad 62.1%
Calm 31.2%
Happy 1.7%
Disgusted 1.6%
Fear 1.2%
Angry 1.1%
Surprised 0.7%
Confused 0.5%

AWS Rekognition

Age 48-56
Gender Female, 90.9%
Happy 57.1%
Fear 25.6%
Calm 12.2%
Surprised 1.9%
Disgusted 1.1%
Confused 1%
Sad 0.6%
Angry 0.5%

AWS Rekognition

Age 35-43
Gender Female, 89.8%
Happy 93.4%
Sad 3.1%
Surprised 1.3%
Disgusted 0.7%
Calm 0.4%
Confused 0.4%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 51-59
Gender Male, 95.1%
Happy 84.8%
Calm 6.9%
Confused 2.6%
Surprised 2.1%
Sad 1.8%
Disgusted 0.7%
Fear 0.7%
Angry 0.5%

AWS Rekognition

Age 51-59
Gender Male, 85.6%
Fear 31%
Surprised 19.4%
Happy 17.8%
Disgusted 13%
Calm 7.9%
Sad 6.2%
Angry 2.6%
Confused 2.2%

AWS Rekognition

Age 29-39
Gender Male, 97.9%
Calm 39.5%
Confused 25%
Sad 18.5%
Happy 9.2%
Disgusted 2.5%
Fear 2.3%
Angry 1.6%
Surprised 1.4%

AWS Rekognition

Age 26-36
Gender Male, 95.2%
Sad 79.9%
Happy 15.1%
Calm 2.3%
Fear 1%
Confused 0.6%
Angry 0.6%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 27-37
Gender Male, 81%
Calm 50.4%
Sad 41.8%
Confused 5.6%
Disgusted 1.1%
Happy 0.5%
Angry 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 48-56
Gender Male, 98.3%
Sad 80.3%
Calm 7.7%
Confused 6.6%
Angry 1.5%
Disgusted 1.4%
Fear 1.2%
Happy 0.9%
Surprised 0.5%

AWS Rekognition

Age 18-26
Gender Male, 90.7%
Confused 51.8%
Sad 19.2%
Calm 15.8%
Fear 4.8%
Happy 2.6%
Angry 2.4%
Surprised 1.8%
Disgusted 1.7%

AWS Rekognition

Age 43-51
Gender Female, 65.4%
Sad 43.5%
Calm 38.2%
Confused 5.3%
Fear 3.5%
Happy 3.2%
Angry 2.8%
Disgusted 2%
Surprised 1.5%

AWS Rekognition

Age 30-40
Gender Male, 99.3%
Sad 56.5%
Calm 35.9%
Fear 2.4%
Angry 2.4%
Confused 0.9%
Disgusted 0.7%
Happy 0.6%
Surprised 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 99%
Person 98%
Person 97.5%
Person 96.8%
Person 96.3%
Person 95.2%
Person 95%
Person 94.6%
Person 93.4%
Person 91.2%
Person 79.8%
Person 65.7%
Person 49.8%
Person 44.9%
Chair 90.6%

Categories

Text analysis

Amazon

YT33AS
NAGOY