Human Generated Data

Title

Untitled (family portrait outside)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21579

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.6
Clothing 99.4
Apparel 99.4
Person 99.3
Person 98.4
Person 98.2
Person 97
Person 96.3
Person 95.1
Face 93.7
Grass 93.7
Plant 93.7
Person 93.1
Female 90.1
People 89.4
Shorts 87.5
Outdoors 84.9
Smile 79.8
Shoe 79.2
Footwear 79.2
Tree 78.8
Kid 77.9
Child 77.9
Vegetation 77.6
Girl 73.2
Pants 72.4
Woman 69.1
Yard 68.1
Nature 68.1
Dress 67.5
Portrait 65.2
Photography 65.2
Photo 65.2
Boy 65
Park 63
Lawn 63
Coat 60.3
Suit 58.4
Overcoat 58.4
Teen 56.4

Clarifai
created on 2023-10-22

people 99.9
child 99
group together 98.8
group 98.5
adult 96.3
man 95.4
canine 94.3
recreation 93.1
woman 92.9
boy 91.5
family 91.2
leader 91.1
several 90.8
administration 89.2
five 88.8
wear 87
veil 86.6
war 86.3
four 85.4
sibling 84.9

Imagga
created on 2022-03-05

kin 37.6
world 31.1
man 30.9
sunset 29.7
child 26.6
silhouette 23.2
people 22.9
male 21.4
beach 21.2
outdoors 19.2
adult 18.8
sky 18.5
outdoor 16.8
water 16
sport 15.9
couple 15.7
person 15.6
summer 15.4
active 14.4
walking 14.2
love 14.2
sea 13.3
happy 12.5
lifestyle 12.3
two 11.9
mother 11.7
ocean 11.6
parent 11.4
sun 11.3
happiness 11
black 10.8
dusk 10.5
sibling 10.4
portrait 10.4
pedestrian 10.3
evening 10.3
leisure 10
park 9.9
family 9.8
together 9.6
boy 9.6
walk 9.5
men 9.5
day 9.4
clouds 9.3
joy 9.2
freedom 9.2
danger 9.1
vacation 9
romantic 8.9
travel 8.5
relationship 8.4
dark 8.4
protection 8.2
dirty 8.1
activity 8.1
sand 8.1
sexy 8
life 8
clothing 8
women 7.9
autumn 7.9
destruction 7.8
toxic 7.8
sunny 7.8
mask 7.7
hand 7.6
free 7.5
fun 7.5
landscape 7.4
holding 7.4
light 7.4
romance 7.1
grass 7.1
businessman 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 99.8
clothing 97.5
text 94.5
person 94.3
posing 91.8
man 79.2
smile 79
footwear 71.7
sport 71.6
black and white 65.2
group 63.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 41-49
Gender Male, 99.7%
Sad 46.2%
Calm 27.6%
Fear 19.6%
Confused 1.8%
Angry 1.7%
Disgusted 1.5%
Surprised 1.1%
Happy 0.6%

AWS Rekognition

Age 50-58
Gender Female, 98.6%
Happy 45.2%
Sad 23.1%
Calm 12.2%
Angry 6.6%
Fear 5.4%
Surprised 4.8%
Disgusted 2.3%
Confused 0.3%

AWS Rekognition

Age 23-33
Gender Female, 55%
Calm 99.1%
Happy 0.5%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 62.2%
Calm 98.6%
Happy 0.9%
Surprised 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0%
Sad 0%

AWS Rekognition

Age 41-49
Gender Female, 72.6%
Confused 70.6%
Sad 7.8%
Calm 6.4%
Happy 6.2%
Disgusted 3.1%
Surprised 2.9%
Angry 2.2%
Fear 0.9%

AWS Rekognition

Age 50-58
Gender Male, 89.9%
Calm 98.9%
Confused 0.3%
Happy 0.2%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.8%
Sad 53.8%
Calm 25.5%
Surprised 10.5%
Disgusted 3.8%
Angry 3.6%
Fear 1.1%
Happy 0.8%
Confused 0.7%

AWS Rekognition

Age 26-36
Gender Male, 87.4%
Calm 53.1%
Surprised 40.4%
Sad 2.1%
Happy 1.2%
Angry 1.2%
Disgusted 0.8%
Confused 0.7%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.7%
Person 99.6%
Person 99.3%
Person 98.4%
Person 98.2%
Person 97%
Person 96.3%
Person 95.1%
Person 93.1%
Shoe 79.2%