Human Generated Data

Title

Untitled (family eating at dining room table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16879

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.3
Human 99.3
Room 98.7
Indoors 98.7
Person 98.7
Person 98.6
Person 96.3
Person 96.1
Interior Design 94.9
Person 92.2
Tie 89.9
Accessories 89.9
Accessory 89.9
Person 86.9
Person 86.3
Furniture 80.9
Sunglasses 79.6
Person 77.5
Bedroom 71.2
Living Room 67.3
Dining Table 66.2
Table 66.2
People 63.9
Text 62.4
Meal 60.3
Food 60.3
Restaurant 57.2
Classroom 56.9
School 56.9
Person 56.2

Clarifai
created on 2023-10-29

people 99.9
group 98.6
monochrome 98.4
child 97.8
group together 97.7
woman 97.6
man 96.6
adult 96.5
many 94.1
administration 93.7
leader 91.8
furniture 91.7
family 90.9
chair 90.3
several 88.8
music 88.3
room 87.9
indoors 87
sit 86.2
recreation 86.1

Imagga
created on 2022-02-26

man 29.5
people 26.8
room 26.7
person 22.9
male 22.2
classroom 21
blackboard 20.7
men 16.3
teacher 15
adult 15
old 13.9
happy 13.8
business 13.4
businessman 13.2
couple 13.1
smiling 13
barbershop 12.6
black 12
cheerful 11.4
group 11.3
senior 11.2
home 11.2
sitting 11.2
shop 11.1
women 11.1
family 10.7
board 9.9
vintage 9.9
school 9.9
hand 9.9
human 9.7
indoors 9.7
looking 9.6
education 9.5
hairdresser 9.2
office 9.1
portrait 9.1
team 9
table 8.7
meeting 8.5
friends 8.4
brass 8.4
building 8.3
indoor 8.2
mercantile establishment 8.1
idea 8
holiday 7.9
day 7.8
happiness 7.8
standing 7.8
hands 7.8
class 7.7
life 7.7
student 7.6
house 7.5
musical instrument 7.3
aged 7.2
lifestyle 7.2
executive 7.2
wind instrument 7.2
to 7.1
work 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 96.5
indoor 90.3
text 90
window 84.1
clothing 74.1
group 65.5
black and white 53.7
table 52.7
crowd 0.6

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 93.6%
Calm 90.4%
Surprised 4.1%
Confused 2.9%
Sad 0.9%
Angry 0.5%
Fear 0.5%
Disgusted 0.5%
Happy 0.3%

AWS Rekognition

Age 31-41
Gender Male, 98.5%
Calm 90.5%
Sad 6.7%
Confused 0.8%
Surprised 0.5%
Angry 0.5%
Happy 0.4%
Disgusted 0.4%
Fear 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Happy 79.6%
Calm 12.6%
Sad 2.9%
Confused 2.1%
Disgusted 1.5%
Surprised 0.7%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 37-45
Gender Male, 97.1%
Calm 97.6%
Confused 0.9%
Surprised 0.7%
Disgusted 0.3%
Sad 0.2%
Angry 0.1%
Fear 0.1%
Happy 0%

AWS Rekognition

Age 41-49
Gender Female, 64.1%
Calm 94.3%
Happy 4.5%
Confused 0.3%
Surprised 0.2%
Disgusted 0.2%
Sad 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 96.9%
Sad 76.5%
Calm 11.8%
Confused 4.4%
Angry 2.7%
Surprised 1.6%
Disgusted 1.3%
Happy 1.1%
Fear 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Sunglasses
Person 99.3%
Person 98.7%
Person 98.6%
Person 96.3%
Person 96.1%
Person 92.2%
Person 86.9%
Person 86.3%
Person 77.5%
Person 56.2%
Tie 89.9%
Sunglasses 79.6%

Text analysis

Amazon

KK