Human Generated Data

Title

Untitled (female graduates planting tree)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19080

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.9
Human 98.9
Person 97.5
Person 97
Person 96.2
Person 95.2
Clothing 93.5
Apparel 93.5
Person 92.1
Person 88.8
Person 86.8
Funeral 70.7
People 70.1
Portrait 64
Face 64
Photography 64
Photo 64
Flower Bouquet 62.4
Flower 62.4
Blossom 62.4
Plant 62.4
Flower Arrangement 62.4
Symbol 57.4
Robe 56.1
Fashion 56.1
Crowd 55.6
Priest 55.4
Person 49.1
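
The Amazon labels above are the standard output of Rekognition's DetectLabels operation. A minimal sketch of how label/confidence pairs in this shape can be produced with boto3; the file name photo.jpg and the MaxLabels/MinConfidence values are assumptions for illustration, not part of the original record:

import boto3

# Rekognition client; AWS credentials come from the usual config/env vars.
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,       # assumed cap; the record lists ~28 labels
        MinConfidence=45,   # assumed floor; the lowest label above is 49.1
    )

# Print "Name Confidence" pairs in the same layout as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")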

Clarifai
created on 2023-10-22

people 99.9
adult 98.5
group 98.4
group together 98
man 97.9
woman 97.4
wear 97.3
leader 97
gown (clothing) 96.2
many 96
wedding 95
several 93.4
veil 92.7
religion 92.5
child 89.6
outfit 89.5
administration 87.8
ceremony 87.4
two 86.9
military 85.9

Imagga
created on 2022-03-05

cemetery 29.1
picket fence 25.4
fence 22
person 18.3
clothing 17.2
lab coat 16.7
barrier 16.3
coat 16.2
old 15.3
people 15.1
man 13.4
adult 13.2
park 12.3
garment 11.5
male 11.3
landscape 11.1
church 11.1
dress 10.8
outside 10.3
obstruction 10.2
world 10.2
tree 10.1
groom 10
religion 9.9
travel 9.9
trees 9.8
bride 9.6
couple 9.6
scene 9.5
men 9.4
day 9.4
wedding 9.2
house 9.2
road 9
new 8.9
building 8.9
home 8.8
forest 8.7
love 8.7
gown 8.6
religious 8.4
portrait 8.4
traditional 8.3
outdoors 8.2
happy 8.1
family 8
fan 8
smiling 8
faith 7.7
garden 7.5
clothes 7.5
vintage 7.4
mother 7.4
room 7.4
danger 7.3
fall 7.2
color 7.2
history 7.2
grass 7.1
rural 7
architecture 7
together 7
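
Imagga exposes its tagger over plain REST rather than an SDK. A sketch of the v2 tagging call using the requests library, assuming the image is publicly reachable at image_url and that api_key/api_secret hold valid Imagga credentials (all three are placeholders):

import requests

api_key, api_secret = "YOUR_KEY", "YOUR_SECRET"       # hypothetical credentials
image_url = "https://example.org/photo.jpg"           # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(api_key, api_secret),  # Imagga uses HTTP basic auth
)
resp.raise_for_status()

# Each result entry carries a confidence and a per-language tag name.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")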

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 95.2
white 88.5
black and white 85.6
text 81.9
black 72.9
old 69.6
night 68.2
clothing 66
person 57.2
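
The Microsoft tags come from the Azure Computer Vision service. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and photo.jpg are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical Azure resource endpoint and subscription key.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# ImageTag objects expose a name and a 0-1 confidence score.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")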

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 99.9%
Calm 99.2%
Happy 0.6%
Surprised 0.1%
Disgusted 0.1%
Confused 0%
Angry 0%
Fear 0%
Sad 0%

AWS Rekognition

Age 31-41
Gender Male, 99.7%
Calm 93.9%
Happy 3.8%
Surprised 0.8%
Sad 0.5%
Disgusted 0.4%
Angry 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Male, 69.5%
Sad 43.4%
Calm 34.7%
Disgusted 6.8%
Happy 6.7%
Surprised 3.7%
Confused 1.9%
Angry 1.8%
Fear 0.9%

AWS Rekognition

Age 23-33
Gender Female, 80%
Calm 99.9%
Surprised 0%
Disgusted 0%
Happy 0%
Sad 0%
Confused 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 22-30
Gender Male, 97.2%
Calm 99.7%
Sad 0.1%
Fear 0%
Confused 0%
Surprised 0%
Disgusted 0%
Happy 0%
Angry 0%
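
Each AWS Rekognition block above (an age range, a gender estimate, and a ranked emotion distribution) corresponds to one FaceDetails entry returned by DetectFaces. A sketch with boto3, again assuming a hypothetical local photo.jpg:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; rank them the way the record above does.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()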

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
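
Unlike Rekognition's percentages, the Google Vision blocks report bucketed likelihoods (Very unlikely through Very likely), which map to the Likelihood enum returned by face detection. A sketch with the google-cloud-vision client (2.x API assumed; photo.jpg is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face carries bucketed likelihoods, e.g. VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
    print()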

Feature analysis

Amazon

Person
Person 98.9%
Person 97.5%
Person 97%
Person 96.2%
Person 95.2%
Person 92.1%
Person 88.8%
Person 86.8%
Person 49.1%

Captions

Microsoft
created on 2022-03-05

an old photo of a man 74.1%
old photo of a man 70.6%
a old photo of a man 68.6%
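
The ranked captions (reproduced verbatim above, including the ungrammatical third candidate) come from the Describe operation of the same Azure Computer Vision service used for the Microsoft tags; each candidate carries its own confidence. A sketch with the same placeholder endpoint and key as before:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_SUBSCRIPTION_KEY"),
)

with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Candidate captions, best first, each with a 0-1 confidence.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")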

Text analysis

Amazon

...
YТ3А-

Google

YT37A°2-NAGOX
YT37A°2-NAGOX
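
The detected strings, likely the mirror-reversed "KODAK SAFETY" film-edge marking, come from the services' OCR endpoints: DetectText on the Amazon side and text detection on the Google side. A boto3 sketch for the Amazon result, with photo.jpg again a placeholder:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries are whole detected strings; WORD entries repeat them
# piecewise, which is one reason OCR output often shows duplicates.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])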