Human Generated Data

Title

Untitled (children with dog and large wreath in front of house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16379

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Person 99.4
Human 99.4
Person 99.3
Clothing 98.7
Apparel 98.7
Person 97.5
Shelter 96.8
Countryside 96.8
Rural 96.8
Outdoors 96.8
Nature 96.8
Building 96.8
Dress 90.2
Female 89
Face 88.9
Shoe 84.5
Footwear 84.5
Grass 80
Plant 80
Person 79.1
Shorts 78.7
Shoe 75
Costume 74
Suit 71
Coat 71
Overcoat 71
Woman 69.2
Photography 68.4
Photo 68.4
Portrait 67.9
Girl 67.8
Tree 67.5
People 66.7
Brick 66
Robe 61.9
Fashion 61.9
Housing 61
Floor 60.7
Door 60
Gown 57.3
Man 57
Play 56.6
Bridegroom 56.6
Wedding 56.6

Clarifai
created on 2023-10-28

people 99.9
bride 98.5
wedding 98.5
veil 97.2
group 96.7
two 96.7
woman 95.9
adult 94.7
couple 94.5
actor 94.4
man 93.7
monochrome 93.6
groom 93.1
family 92.4
music 92.4
three 92.3
child 91.6
wear 90.1
actress 89.7
street 89.5

Imagga
created on 2022-02-11

picket fence 33.7
fence 26.3
people 25.1
man 22.9
male 21.3
barrier 20.6
person 18.3
business 18.2
adult 17.1
businessman 16.8
men 15.5
portrait 14.2
obstruction 13.7
clothing 13.2
architecture 12.6
structure 12.5
travel 12
women 11.9
old 11.8
suit 11.1
day 11
dress 10.8
window 10.8
couple 10.5
historical 10.4
building 10.3
corporate 10.3
groom 10.1
history 9.8
family 9.8
human 9.7
statue 9.6
happiness 9.4
monument 9.3
two 9.3
traditional 9.1
city 9.1
professional 9.1
holding 9.1
group 8.9
life 8.9
job 8.8
looking 8.8
art 8.7
ancient 8.6
culture 8.5
black 8.4
color 8.3
sculpture 8.3
historic 8.2
tourism 8.2
park 8.2
marble 7.9
world 7.9
crowd 7.7
walking 7.6
clothes 7.5
occupation 7.3
worker 7.3
smiling 7.2
work 7.2
religion 7.2
team 7.2
mask 7.2
romantic 7.1
face 7.1
bride 7.1
together 7

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 93.9
window 93.7
posing 87.8
standing 84.2
wedding dress 83.4
clothing 73.5
black and white 67.3
bride 66.9
person 55.6
old 44.1

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 68.8%
Calm 43%
Happy 41.7%
Surprised 4.1%
Confused 4%
Fear 2.7%
Sad 2.5%
Angry 1.1%
Disgusted 0.8%

AWS Rekognition

Age 31-41
Gender Male, 99.1%
Calm 82.9%
Sad 8.3%
Happy 4.6%
Disgusted 1.3%
Confused 1.1%
Angry 0.7%
Fear 0.6%
Surprised 0.5%

AWS Rekognition

Age 30-40
Gender Female, 52.1%
Happy 95.4%
Fear 1.4%
Sad 1%
Surprised 1%
Calm 0.5%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Person 99.3%
Person 97.5%
Person 79.1%
Shoe 84.5%
Shoe 75%

Categories

Imagga

interior objects 53%
paintings art 45.3%

Text analysis

Amazon

6
KODAK-2-E1W

Google

MJI7-- Y T37A°2 -- - NAGOX