Human Generated Data

Title

Untitled (family portrait, sitting on fence outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16880

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.9
Human 99.9
Person 99.8
Person 99.7
Person 99.4
Person 99
Clothing 98.9
Apparel 98.9
Shoe 98
Footwear 98
Shorts 91.5
Person 88.2
People 88.2
Tie 87.7
Accessories 87.7
Accessory 87.7
Military Uniform 86.8
Military 86.8
Dog 85.6
Mammal 85.6
Animal 85.6
Canine 85.6
Pet 85.6
Soldier 77.7
Dress 76.7
Female 74.9
Army 74.8
Armored 74.8
Face 70.4
Brick 68.1
Girl 62.3
Shoe 60.3
Outdoors 60.1
Costume 59.1
Photography 55.5
Photo 55.5
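The Amazon tag list above matches the shape of an AWS Rekognition `DetectLabels` response (label names with confidence scores). A minimal sketch of flattening such a response into the sorted name/score pairs shown here — the client call and image bytes are assumptions, not part of this record:

```python
def top_labels(response, min_confidence=55.0):
    """Flatten a Rekognition detect_labels-style response into
    (name, confidence) pairs, sorted by descending confidence,
    keeping only scores at or above the threshold."""
    pairs = [(lbl["Name"], round(lbl["Confidence"], 1))
             for lbl in response.get("Labels", [])
             if lbl["Confidence"] >= min_confidence]
    return sorted(pairs, key=lambda p: -p[1])

# A typical call (assumed setup: boto3 client, credentials, image bytes):
#   client = boto3.client("rekognition")
#   resp = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
#   for name, conf in top_labels(resp):
#       print(name, conf)
```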

Clarifai
created on 2023-10-29

people 99.9
group together 99.6
group 98.9
child 98.6
adult 98.5
many 96.3
several 95.7
man 95.2
three 92.1
woman 91.5
recreation 90.9
wear 89.8
military 89.4
adolescent 88.8
boy 88.7
war 86.8
four 85.1
sports equipment 84
leader 84
outfit 82.8

Imagga
created on 2022-02-26

kin 57.1
people 22.3
man 22.2
portrait 16.8
person 16.1
male 15
adult 14.9
attractive 14.7
happy 13.2
lifestyle 13
child 12.8
sport 12.6
lady 12.2
fashion 12.1
sexy 12
beach 12
love 11.8
couple 11.3
pretty 11.2
wall 11.1
summer 10.9
model 10.9
sunset 10.8
silhouette 10.8
outdoor 10.7
outdoors 10.5
clothing 10.1
dress 9.9
sand 9.7
sitting 9.4
water 9.3
holding 9.1
smiling 8.7
swimsuit 8.7
cute 8.6
outside 8.6
two 8.5
black 8.4
world 8.3
leisure 8.3
sky 8.3
vacation 8.2
danger 8.2
art 8
life 8
parent 7.9
together 7.9
happiness 7.8
standing 7.8
old 7.7
newspaper 7.3
work 7.3
sun 7.2
looking 7.2
body 7.2
recreation 7.2
women 7.1
grass 7.1
posing 7.1
mother 7.1
seat 7.1
autumn 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

clothing 96.1
outdoor 94.7
text 90.5
man 89.7
person 88.6
footwear 84.4
posing 80.1

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 57%
Calm 99.2%
Happy 0.3%
Sad 0.2%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 84.2%
Calm 97.2%
Sad 1.2%
Happy 0.7%
Surprised 0.3%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 33-41
Gender Female, 98.3%
Happy 82.3%
Calm 14.9%
Surprised 1%
Confused 0.5%
Disgusted 0.5%
Angry 0.3%
Sad 0.3%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Female, 98.9%
Happy 99%
Sad 0.7%
Calm 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0%
Surprised 0%
Confused 0%

AWS Rekognition

Age 11-19
Gender Female, 51.1%
Calm 65.1%
Happy 31.4%
Sad 1.1%
Disgusted 0.8%
Fear 0.5%
Confused 0.4%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 42-50
Gender Female, 50.5%
Happy 95.4%
Calm 2.9%
Surprised 0.8%
Confused 0.2%
Disgusted 0.2%
Sad 0.2%
Angry 0.1%
Fear 0.1%
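The per-face blocks above follow the structure of an AWS Rekognition `DetectFaces` response with `Attributes=['ALL']`: an age range, a gender estimate with confidence, and a list of emotions. A sketch of rendering such a response into blocks like these — the response shape is the documented Rekognition one, but the call itself is assumed:

```python
def summarize_faces(response):
    """Render detect_faces FaceDetails into text blocks like those above:
    age range, gender with confidence, then emotions sorted by confidence."""
    out = []
    for face in response.get("FaceDetails", []):
        age = face["AgeRange"]
        lines = [
            f"Age {age['Low']}-{age['High']}",
            f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%",
        ]
        # Rekognition reports emotion types in upper case (e.g. "CALM").
        for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
        out.append("\n".join(lines))
    return out

# Assumed call:
#   resp = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
#   for block in summarize_faces(resp):
#       print(block, end="\n\n")
```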

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%
Person 99.8%
Person 99.7%
Person 99.4%
Person 99%
Person 88.2%
Shoe 98%
Shoe 60.3%
Tie 87.7%
Dog 85.6%

Text analysis

Amazon

7
KODAK-A-ITW

Google

MJI7-- YT3RA°2--XAGOX
MJI7--
YT3RA°2--XAGOX