Human Generated Data

Title

Untitled (family gathered in living room next to Christmas tree)

Date

c. 1950

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3528

Machine Generated Data

Tags (confidence scores, 0-100)

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.6
Person 99.1
Person 99.1
Clothing 99
Apparel 99
Tree 98.7
Plant 98.7
Person 97.2
Person 96.2
Shorts 95.1
Person 92.2
Footwear 87.2
Shoe 87.2
Ornament 79
Chair 72.4
Furniture 72.4
Suit 71.9
Overcoat 71.9
Coat 71.9
Dress 69.6
Female 67.3
People 66.2
Kid 61.7
Child 61.7
Crowd 59.7
Girl 59.1
Christmas Tree 58.2
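
The label list above matches the shape of output from the AWS Rekognition DetectLabels API, which reports each label with a 0-100 confidence score. A minimal Python sketch is below; the boto3 call is real, but the file name and confidence threshold are illustrative assumptions, not values recorded with this object.

    import boto3

    # Assumes AWS credentials are already configured in the environment.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the print
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=50,  # illustrative threshold
        )

    for label in response["Labels"]:
        # Confidence is a 0-100 score, as in the list above.
        print(f"{label['Name']} {label['Confidence']:.1f}")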

Clarifai
created on 2023-10-26

people 99.9
group 99.1
group together 98.2
woman 97.2
adult 96.4
man 95.8
leader 95.3
many 94.2
child 94.2
administration 92
several 88.7
recreation 86.9
musician 82.5
music 81
boy 80.1
five 79.9
three 79.6
wear 77.8
portrait 76.5
education 74.8

Imagga
created on 2022-01-22

man 39
people 28.5
person 28.4
male 28
couple 21.8
happiness 20.4
happy 18.8
child 18.8
beach 18.5
adult 18.2
love 18.1
family 16.9
boy 16.5
life 15.5
summer 15.4
businessman 15
fun 15
silhouette 14.9
world 14.7
father 14.6
outdoors 14.2
sky 14
men 13.7
lifestyle 13.7
teacher 13.6
sunset 13.5
dad 13.4
business 13.4
black 13.2
sea 12.5
smiling 12.3
group 12.1
joy 11.7
together 11.4
sport 11.4
water 11.3
student 11.2
two 11
groom 10.7
outdoor 10.7
blackboard 10.6
success 10.5
standing 10.4
walking 10.4
women 10.3
day 10.2
leisure 10
active 9.9
holding 9.9
vacation 9.8
portrait 9.7
kin 9.3
school 9.2
park 9.1
human 9
cheerful 8.9
sand 8.7
room 8.7
mother 8.7
bride 8.6
holiday 8.6
professional 8.6
parent 8.5
friends 8.5
relationship 8.4
friendship 8.4
relaxation 8.4
ocean 8.3
dress 8.1
team 8.1
sun 8.1
romance 8
looking 8
smile 7.8
education 7.8
lab coat 7.7
walk 7.6
finance 7.6
hand 7.6
wife 7.6
sign 7.5
freedom 7.3
new 7.3
kid 7.1
to 7.1
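
The Imagga tags above use the same 0-100 confidence scale. Output of this shape can be requested from Imagga's REST tagging service; the Python sketch below assumes the v2 /tags endpoint and uses placeholder credentials and an assumed image URL, none of which are part of this record.

    import requests

    # Placeholders; the real credentials and image location are not documented here.
    IMAGGA_KEY = "<api-key>"
    IMAGGA_SECRET = "<api-secret>"
    IMAGE_URL = "https://example.org/photo.jpg"

    # Assumes the Imagga v2 tagging endpoint with HTTP basic auth.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # Each tag carries a 0-100 confidence, as in the list above.
        print(f"{item['tag']['en']} {item['confidence']:.1f}")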

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 96
person 90.7
clothing 89.7
black and white 88.3
man 71.1
footwear 56.4
woman 52.3
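
The Microsoft tags resemble output from the Azure Computer Vision image-tagging service. A hedged Python sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders, and the SDK choice is an assumption about how tags of this kind could be reproduced.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key; the actual service configuration is not documented here.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<subscription-key>"),
    )

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the print
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        # The SDK reports confidence on a 0-1 scale; scale by 100 to compare with the list above.
        print(f"{tag.name} {tag.confidence * 100:.1f}")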

Color Analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Male, 96.6%
Calm 91.2%
Sad 6.5%
Confused 0.5%
Happy 0.5%
Angry 0.5%
Disgusted 0.4%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 53-61
Gender Male, 99.9%
Calm 95.3%
Fear 0.9%
Sad 0.9%
Happy 0.8%
Surprised 0.8%
Angry 0.7%
Disgusted 0.3%
Confused 0.3%

AWS Rekognition

Age 52-60
Gender Male, 93.7%
Calm 96.7%
Sad 0.9%
Surprised 0.7%
Confused 0.6%
Fear 0.4%
Happy 0.3%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 28-38
Gender Female, 95%
Calm 93.7%
Happy 3.2%
Sad 1.1%
Confused 0.5%
Surprised 0.4%
Angry 0.4%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 48-54
Gender Male, 99.2%
Happy 78.7%
Calm 8.9%
Sad 7%
Surprised 1.9%
Confused 1.6%
Disgusted 0.9%
Angry 0.7%
Fear 0.3%

AWS Rekognition

Age 38-46
Gender Male, 99.6%
Sad 69.5%
Calm 22.3%
Angry 2.4%
Happy 2%
Disgusted 1.4%
Confused 1.3%
Surprised 0.7%
Fear 0.5%

AWS Rekognition

Age 35-43
Gender Male, 99.4%
Calm 27.4%
Angry 26.7%
Happy 11.2%
Fear 9.9%
Sad 7.7%
Confused 7.5%
Disgusted 5.3%
Surprised 4.2%
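
The per-face age ranges, gender estimates, and emotion scores above correspond to the AWS Rekognition DetectFaces API with full attributes requested. A minimal Python sketch using boto3 is below; the file name is an assumption, and the emotions are sorted only to mirror the listing format.

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the print
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unordered; sort by confidence to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")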

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
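
The Google Vision entries above are per-face likelihood ratings from the Cloud Vision face-detection API, which returns an enum (VERY_UNLIKELY through VERY_LIKELY) for each attribute rather than a numeric score. A minimal Python sketch with the google-cloud-vision client is below; the file name is an assumption.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the print
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum, as reflected in the ratings above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)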

Feature analysis

Amazon

Person 99.7%
Shoe 87.2%

Categories

Text analysis

Amazon

KODAK-EVEA

Google

YT3RA2-XAGOX Drorore
YT3RA2-XAGOX
Drorore
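
The strings above are raw machine-read text and are reproduced exactly as returned by the services. Text of this kind can be extracted with the AWS Rekognition DetectText API; a minimal Python sketch with boto3 follows, with the file name again an assumption.

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the print
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the individual WORD detections
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}")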