Human-Generated Data

Title

Untitled (men and women in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17151

Machine-Generated Data

Tags

Each tag below is paired with the model's confidence score on a 0-100 scale.

Amazon
created on 2022-02-26

Furniture 99.8
Person 99.5
Human 99.5
Person 99.3
Person 99.2
Person 98.3
Room 96
Indoors 96
Person 90.5
Person 90.5
Crib 84.9
Classroom 57.4
School 57.4
Person 45.4
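
These labels and scores match the shape of output from Amazon Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming a local copy of the photograph; the file name and MinConfidence threshold are hypothetical:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_living_room.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # hypothetical threshold, low enough to keep weak hits
    )

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```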

Clarifai
created on 2023-10-29

people 99.9
group 99.6
woman 97.4
adult 96.8
group together 95.6
many 94.6
child 94.4
man 93.9
family 88.9
several 87.2
music 87
education 86.6
wear 83.3
monochrome 82.6
room 81.9
school 81.9
dancing 78.1
boy 76.6
administration 76.1
leader 74.7
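
Concept lists like this can be requested from Clarifai's v2 predict endpoint. A minimal REST sketch using requests; the model ID, API key, and image URL are placeholders, and Clarifai natively reports values on a 0-1 scale, rescaled here to match the list above:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-concepts model ID

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with 0-1 values; scale by 100 to match the scores above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```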

Imagga
created on 2022-02-26

brass 47.2
wind instrument 37.4
people 30.7
person 29.3
musical instrument 28
male 26.2
man 25.5
adult 22.6
room 21.8
men 21.5
group 20.9
teacher 20
nurse 19.9
businessman 19.4
business 18.8
professional 18.4
women 18.2
happy 15
classroom 13.5
team 13.4
family 13.3
portrait 12.9
sport 12.9
home 12.7
silhouette 12.4
interior 12.4
smiling 12.3
motion 12
casual 11.9
educator 11.7
indoors 11.4
couple 11.3
human 11.2
active 11.2
lifestyle 10.8
life 10.7
crowd 10.6
boy 10.4
happiness 10.2
girls 10
cornet 10
exercise 10
dance 9.9
suit 9.9
activity 9.8
urban 9.6
corporate 9.4
businesswoman 9.1
black 9
fun 9
success 8.8
together 8.8
full length 8.7
standing 8.7
meeting 8.5
trombone 8.4
holding 8.2
new 8.1
office 8
work 7.8
smile 7.8
mother 7.8
attractive 7.7
move 7.7
two 7.6
chair 7.6
walking 7.6
city 7.5
teamwork 7.4
executive 7.4
light 7.3
teenager 7.3
children 7.3
board 7.2
copy space 7.2
job 7.1
modern 7
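
Imagga exposes tagging through its v2 /tags endpoint with HTTP basic auth. A minimal sketch; the key, secret, and image URL are placeholders:

```python
import requests

AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders, HTTP basic auth

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=AUTH,
)
response.raise_for_status()

# Each tag carries a 0-100 confidence and a language-keyed name.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```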

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 96.5
text 96.3
clothing 96.3
drawing 50.9
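
Tags in this style can come from Azure Computer Vision's analyze call. A minimal REST sketch; the endpoint, key, image URL, and API version are assumptions:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",  # v3.2 is an assumed API version
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},  # placeholder URL
)
response.raise_for_status()

# Azure reports tag confidence on a 0-1 scale.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```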

Face analysis

AWS Rekognition (face 1)

Age 28-38
Gender Male, 79.9%
Calm 64.1%
Confused 13.4%
Sad 8.1%
Happy 5.3%
Surprised 3.1%
Angry 2.6%
Disgusted 2.3%
Fear 1.1%

AWS Rekognition (face 2)

Age 48-56
Gender Male, 99.6%
Sad 37.3%
Happy 23.5%
Surprised 12.2%
Calm 10.1%
Angry 7.2%
Fear 5.8%
Confused 2.6%
Disgusted 1.4%
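
Estimates like the age ranges, gender scores, and emotion distributions above match Rekognition's DetectFaces output when all attributes are requested. A minimal boto3 sketch; the file name is hypothetical:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_living_room.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types arrive uppercase (e.g. CALM); print highest first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```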

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 5)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 6)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision (face 7)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 8)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
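
Google Cloud Vision's face detection reports exactly these likelihood buckets (Very unlikely through Very likely) for each detected face. A minimal sketch with the google-cloud-vision client; the file name is hypothetical:

```python
from google.cloud import vision

def bucket(likelihood):
    # Render the Likelihood enum as a label like "Very unlikely".
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

client = vision.ImageAnnotatorClient()

with open("untitled_living_room.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```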

Feature analysis

Amazon

Person
Person 99.5%
Person 99.3%
Person 99.2%
Person 98.3%
Person 90.5%
Person 90.5%
Person 45.4%
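
The per-person scores above correspond to the Instances field that DetectLabels returns for the Person label: one bounding box and confidence per detected person. A minimal self-contained sketch; the file name and threshold are hypothetical:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_living_room.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=40)

for label in response["Labels"]:
    if label["Name"] == "Person":
        # One entry per detected person, each with its own confidence.
        for instance in label["Instances"]:
            print(f"Person {instance['Confidence']:.1f}%")
```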

Text analysis

Amazon

34
134
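
Strings like these can be recovered with Rekognition's DetectText operation. A minimal boto3 sketch; the file name is hypothetical:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_living_room.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

# Detections are LINEs plus their component WORDs, each with a confidence.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")
```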