Human Generated Data

Title

Untitled (female graduates planting tree)

Date

c. 1950

People

Artist: Robert Burian, American, active 1940s–1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19078

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.8
Apparel 99.8
Person 98.8
Human 98.8
Person 97.8
Person 97.3
Person 96.9
Person 96.5
Person 93.5
Robe 92
Fashion 92
Wedding 89.3
Gown 88.9
Person 88.4
Person 85.9
Person 84
Wedding Gown 74.4
People 68.8
Photography 67.3
Photo 67.3
Portrait 64.1
Face 64.1
Bride 63.5
Plant 63.1
Dress 59.9
Crowd 58.8
Flower 55.9
Blossom 55.9

Clarifai
created on 2023-10-22

people 99.9
adult 98.7
group 98.3
leader 98.1
wear 97.6
man 97.6
gown (clothing) 97.1
veil 96.4
group together 95.9
many 94.7
wedding 94.4
administration 94
religion 92.2
several 92
ceremony 91.6
woman 90.9
priest 90.3
clergy 89.4
military 88.8
two 88

Imagga
created on 2022-03-05

white 26.5
cemetery 25.8
person 21.5
clothing 16.7
lab coat 15.7
landscape 15.6
garment 14
outdoor 13.7
coat 13.5
scene 12.1
people 11.7
travel 11.3
old 11.1
picket fence 10.3
man 10.1
water 10
park 9.9
farm 9.8
mountain 9.8
bathrobe 9.3
tree 9.2
adult 9.1
tourism 9.1
fence 9
sky 8.9
building 8.8
men 8.6
outside 8.5
male 8.5
clothes 8.4
summer 8.4
stone 8.3
environment 8.2
outdoors 8.2
rural 7.9
robe 7.9
grass 7.9
black 7.8
architecture 7.8
fan 7.8
sunrise 7.5
seller 7.3
group 7.2
tourist 7.2
color 7.2
sunset 7.2
religion 7.2
history 7.1
businessman 7.1
day 7.1
gown 7.1
country 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

tree 99.5
outdoor 97.2
text 96.7
white 76.4
clothing 73.2
black and white 71.5
old 57

Color Analysis

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 76.3%
Sad 96.2%
Happy 2.9%
Calm 0.3%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 24-34
Gender Male, 95.9%
Calm 97.7%
Sad 0.7%
Happy 0.6%
Surprised 0.4%
Disgusted 0.3%
Confused 0.2%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 42-50
Gender Male, 96.8%
Happy 68.5%
Calm 23.4%
Surprised 4.7%
Sad 1%
Disgusted 0.9%
Fear 0.6%
Confused 0.5%
Angry 0.4%

AWS Rekognition

Age 49-57
Gender Male, 87.7%
Surprised 48%
Calm 41.7%
Sad 3.6%
Happy 2.7%
Confused 2.5%
Disgusted 0.6%
Fear 0.5%
Angry 0.4%

AWS Rekognition

Age 37-45
Gender Female, 56.4%
Sad 41.1%
Happy 20.1%
Calm 19.2%
Disgusted 6.1%
Fear 5.4%
Surprised 3.4%
Angry 2.8%
Confused 1.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.8%
Person 97.8%
Person 97.3%
Person 96.9%
Person 96.5%
Person 93.5%
Person 88.4%
Person 85.9%
Person 84%

Captions

Microsoft
created on 2022-03-05

an old photo of a man 65.1%
old photo of a man 61.4%
a old photo of a man 59%

Text analysis

Amazon

JOO