Human Generated Data

Title

Untitled (twelve members of family lined up from tallest to shortest looking to right in living room)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9229

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.1
Person 99
Clothing 98.9
Apparel 98.9
Person 98.4
Person 98.3
Person 97.7
Person 97
Person 95.1
Shorts 91.3
People 90.9
Overcoat 88.8
Coat 88.8
Suit 88.8
Person 87.5
Head 70.7
Indoors 67.4
Family 65.7
Photography 64.9
Portrait 64.9
Face 64.9
Photo 64.9
Dress 63.6
Sailor Suit 63.5
Floor 62.3
Female 61.7
Shirt 61.6
Kid 60.2
Child 60.2
Flooring 58.5
Room 58.1
Person 53.2
Person 50.5

Clarifai
created on 2023-10-27

people 99.8
group 99.3
group together 99
education 96
man 95.7
woman 95.5
school 95.1
adult 94.1
many 90.3
child 88.7
teacher 87.2
several 83.5
boy 81.4
leader 75.4
five 75
actor 74.6
recreation 74.4
monochrome 74.3
squad 73.6
adolescent 73.5

Imagga
created on 2022-01-23

brass 79.9
wind instrument 60.4
cornet 55
musical instrument 41.6
person 31.5
people 30.6
man 24.8
male 24.8
group 19.3
adult 19.2
silhouette 19
human 18.7
black 16.8
men 16.3
nurse 15.9
player 12.4
portrait 12.3
business 12.1
team 11.6
sport 10.9
student 10.9
happy 10.6
medical 10.6
teacher 10.5
boy 10.4
room 9.9
businessman 9.7
crowd 9.6
education 9.5
women 9.5
symbol 9.4
youth 9.4
child 9.3
event 9.2
fashion 9
school 9
family 8.9
science 8.9
professional 8.6
design 8.4
friendship 8.4
blackboard 8.4
health 8.3
occupation 8.2
light 8
classroom 7.9
planner 7.9
couple 7.8
drawing 7.8
sitting 7.7
skill 7.7
studying 7.7
exam 7.7
active 7.5
dark 7.5
holding 7.4
girls 7.3
pose 7.2
board 7.2
stylish 7.2
sexy 7.2
golfer 7.2
interior 7.1
happiness 7
medicine 7
modern 7
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 96.1
text 95.8
clothing 92.8
group 79.7
posing 70.4
man 68.2
old 42.1

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 87.2%
Happy 64.8%
Calm 25.6%
Sad 2.9%
Surprised 2.4%
Fear 1.8%
Angry 1%
Confused 0.8%
Disgusted 0.8%

AWS Rekognition

Age 23-31
Gender Female, 73.1%
Calm 99.4%
Surprised 0.6%
Sad 0%
Angry 0%
Disgusted 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 100%
Calm 36.1%
Confused 25.7%
Sad 15%
Happy 8.9%
Surprised 8.3%
Disgusted 4.1%
Angry 1.2%
Fear 0.8%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Sad 99.7%
Calm 0.2%
Angry 0.1%
Confused 0%
Disgusted 0%
Fear 0%
Happy 0%
Surprised 0%

AWS Rekognition

Age 37-45
Gender Female, 63.4%
Sad 80.6%
Calm 14%
Happy 2%
Confused 1.1%
Surprised 0.8%
Angry 0.6%
Disgusted 0.5%
Fear 0.4%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Calm 95.3%
Sad 2.3%
Surprised 0.9%
Confused 0.5%
Angry 0.5%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Female, 92.4%
Calm 93.9%
Sad 4.6%
Disgusted 0.7%
Surprised 0.2%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Female, 54.1%
Calm 97.6%
Sad 1.6%
Happy 0.4%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 79.5%
Calm 99.8%
Sad 0.1%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%
Angry 0%
Surprised 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

3
st 3 83 .
st
.
83
M
KUDAK-AE

Google

a
VT
A2
A
3 8 a 8 VTヨヨA2-A
3
8
ヨヨ
-