Human Generated Data

Title

Untitled (twelve members of family posed in line from tallest to shortest in living room)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9228

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.6
Person 99.3
Person 99.3
Person 99.3
Clothing 98.9
Apparel 98.9
Person 98.4
People 96
Person 95.5
Person 94.5
Person 88.9
Family 87.9
Person 87.3
Face 86.8
Female 86.2
Overcoat 81.1
Coat 81.1
Suit 81.1
Dress 80.6
Person 79.4
Helmet 76.7
Person 76
Shorts 69.6
Kid 67.8
Child 67.8
Woman 67.6
Photography 67.1
Photo 67.1
Portrait 66.9
Sailor Suit 66.3
Girl 65.6
Indoors 65.4
Shoe 62.8
Footwear 62.8
Man 61.6
Poster 60.6
Advertisement 60.6
Collage 59.7
Shirt 56.7
Baby 56.6
Floor 56.5
Costume 56.3
Pants 55.7
Room 55.6
Door 55.5
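
The label/confidence pairs above are the kind of output Amazon Rekognition's label detection returns. A minimal sketch with boto3 follows; the file name, MaxLabels, and MinConfidence values are illustrative assumptions, not the settings used for this record.

import boto3

# Assumption: AWS credentials and region are already configured in the environment.
rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # illustrative cap
    MinConfidence=55.0,  # illustrative threshold, roughly matching the lowest scores above
)

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")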

Clarifai
created on 2023-10-27

people 99.7
group together 98.9
group 98.5
man 95.5
woman 93.7
adult 93.1
many 91.5
education 91
several 85.3
school 84.6
monochrome 84.4
child 83.5
adolescent 77
actor 76.5
leader 75.6
teacher 74.1
boy 70.4
athlete 69.8
recreation 67.9
five 67.9
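
A sketch of how concept tags like these can be requested from Clarifai with the clarifai-grpc client; the personal access token, user/app IDs, model ID, and image URL below are assumptions about a typical setup, not the configuration used for this record.

from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder personal access token

request = service_pb2.PostModelOutputsRequest(
    # Assumption: Clarifai's public general image-recognition model.
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[resources_pb2.Input(
        data=resources_pb2.Data(image=resources_pb2.Image(url="https://example.com/photo.jpg"))
    )],
)

response = stub.PostModelOutputs(request, metadata=metadata)

# Concept values are 0-1; scaled to 0-100 to match the list above.
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")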

Imagga
created on 2022-01-23

brass 68.8
cornet 58.7
wind instrument 53.7
musical instrument 37.8
person 33.7
people 32.3
man 25.5
men 23.2
male 22
player 21.5
adult 21
golfer 18.8
group 18.5
businessman 16.8
black 16.2
sport 15.9
business 15.8
human 15.7
happy 15.7
professional 15.3
silhouette 14.9
portrait 14.9
contestant 12.9
exercise 12.7
couple 12.2
youth 11.1
joy 10.8
family 10.7
active 10.3
smile 10
fun 9.7
crowd 9.6
boy 9.6
standing 9.6
lifestyle 9.4
health 9
team 9
activity 8.9
success 8.8
together 8.8
women 8.7
happiness 8.6
party 8.6
friendship 8.4
company 8.4
dark 8.3
sky 8.3
style 8.2
suit 8.1
symbol 8.1
handsome 8
body 8
uniform 7.9
design 7.9
work 7.8
child 7.8
attractive 7.7
room 7.7
casual 7.6
hand 7.6
fashion 7.5
student 7.5
nurse 7.4
event 7.4
training 7.4
light 7.3
art 7.3
smiling 7.2
fitness 7.2
worker 7.1
job 7.1
medical 7.1
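
A sketch of the equivalent request to Imagga's v2 tagging endpoint; the key, secret, and image URL are placeholders, and this REST call is an assumption about how tags like these are produced.

import requests

# Assumption: Imagga v2 REST API with HTTP Basic auth (API key and secret).
API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # hypothetical image URL
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, as listed above.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")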

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.7
posing 93.9
clothing 92.2
text 91.8
standing 81.8
footwear 81.5
group 81
man 63.7
old 46.1
female 28.8
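
A sketch of how tags like these can be obtained from Microsoft's Azure Computer Vision service; the endpoint, key, and image URL are placeholders, and requesting only the tags feature is an assumption, not a record of the call actually used.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: substitute your own Cognitive Services endpoint and key.
client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

analysis = client.analyze_image(
    "https://example.com/photo.jpg",            # hypothetical image URL
    visual_features=[VisualFeatureTypes.tags],  # request tag predictions only
)

# Confidences are 0-1; scaled to 0-100 to match the list above.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")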

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Male, 81.1%
Calm 97.9%
Sad 0.7%
Surprised 0.6%
Disgusted 0.3%
Confused 0.2%
Fear 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.4%
Calm 69.9%
Sad 26.9%
Confused 1.1%
Angry 0.8%
Surprised 0.5%
Disgusted 0.4%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 18-26
Gender Female, 90.5%
Sad 40.5%
Calm 24.2%
Happy 15.7%
Angry 5.8%
Fear 4.1%
Disgusted 4%
Surprised 4%
Confused 1.8%

AWS Rekognition

Age 29-39
Gender Male, 63.8%
Happy 53.4%
Sad 23.7%
Calm 13.9%
Fear 4.1%
Confused 1.9%
Surprised 1.4%
Angry 0.8%
Disgusted 0.7%

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Sad 40.2%
Happy 27.3%
Calm 24.4%
Angry 2.4%
Surprised 1.7%
Confused 1.6%
Disgusted 1.3%
Fear 1.1%

AWS Rekognition

Age 22-30
Gender Male, 99.8%
Calm 87.9%
Sad 10.7%
Surprised 0.5%
Confused 0.2%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 11-19
Gender Male, 99.5%
Calm 71%
Surprised 26.6%
Sad 1%
Angry 0.4%
Confused 0.4%
Disgusted 0.2%
Happy 0.2%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 100%
Calm 65.9%
Sad 12.7%
Confused 7.3%
Surprised 3.8%
Angry 3.8%
Happy 3.5%
Disgusted 2%
Fear 1%

AWS Rekognition

Age 29-39
Gender Female, 52.3%
Calm 50.9%
Happy 41.6%
Surprised 4.7%
Angry 1.2%
Disgusted 0.7%
Sad 0.6%
Fear 0.2%
Confused 0.2%
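
The age range, gender, and emotion scores above correspond to Amazon Rekognition's face-detail output. A minimal sketch with boto3 follows; the file name is a placeholder, and Attributes=["ALL"] is assumed since age and emotion only appear with the full attribute set.

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned unordered; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")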

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
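
The per-face likelihood ratings above match Google Cloud Vision's face detection output. A minimal sketch follows; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("photograph.jpg", "rb") as f:  # hypothetical local scan of the print
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum ratings such as VERY_UNLIKELY, as reported above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)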

Feature analysis

Amazon

Person 99.6%
Helmet 76.7%
Shoe 62.8%

Text analysis

Amazon

8
st
st نو 8
نو
KUDAK-SVEELA
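
Fragments like these, including the reversed film-edge lettering, come from Amazon Rekognition's text detection. A minimal sketch with boto3 follows; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical local scan of the print
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections give whole strings; WORD detections give the individual tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")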

Google

38 a 3 ヨヨA2-XAno
38
a
3
ヨヨ
A2
-
XAno
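
The Google fragments above are the kind of output Cloud Vision's text detection returns, where the first annotation is the full detected string and the rest are individual tokens. A minimal sketch follows; the file name is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("photograph.jpg", "rb") as f:  # hypothetical local scan of the print
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full text; the remaining entries are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)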