Human Generated Data

Title

Untitled (woman and three boys holding hands outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17563

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.8
Human 99.8
Person 99.4
Person 98.9
Clothing 98.1
Apparel 98.1
Person 97
Shorts 87.8
Female 83.1
Outdoors 77.9
Nature 75
Face 73.8
People 72.1
Plant 68.3
Suit 65
Coat 65
Overcoat 65
Woman 63.1
Grass 59.7
Architecture 59.2
Building 59.2
Wall 57
Dress 56.8
Tree 55.9
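
The tag lists in this section pair each label with the model's confidence in percent (the Feature analysis section below shows the same Amazon values with % signs). A minimal sketch of how Rekognition produces such label/score pairs, assuming configured AWS credentials; the filename "photo.jpg" is hypothetical:

import boto3

client = boto3.client("rekognition")

# Read the image and request labels; MinConfidence=55 mirrors the
# lowest score shown above (Tree 55.9).
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,
    )

# Each label carries a name and a confidence percentage, matching
# the pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')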

Clarifai
created on 2023-10-29

people 99.9
adult 97.1
two 96.4
woman 95.2
man 94
one 91.9
child 90.9
leader 90.8
monochrome 90.5
wear 88.8
group together 87.4
wedding 86.6
group 86.1
art 84.3
street 84.3
furniture 80.9
home 80.6
veil 79.8
family 78.3
three 78.2

Imagga
created on 2022-02-26

fountain 34.3
old 25.8
structure 25.2
stone 23
crutch 22.9
architecture 22
ancient 19.9
staff 17.8
building 17.8
park 17.3
religion 17
history 16.1
tourism 15.7
sculpture 15.5
statue 15.5
tree 14.2
city 14.1
travel 14.1
monument 14
cemetery 13.9
arch 13.8
stick 13.4
art 13.1
antique 12.1
street 12
historic 11.9
people 11.2
church 11.1
grunge 11.1
marble 10.8
god 10.5
man 10.1
catholic 10
landmark 9.9
historical 9.4
culture 9.4
religious 9.4
light 9.4
water 9.3
autumn 8.8
facade 8.6
wall 8.6
walking 8.5
tourist 8.5
famous 8.4
dark 8.4
landscape 8.2
cathedral 8.1
love 7.9
roman 7.8
temple 7.8
scene 7.8
sepia 7.8
outdoor 7.6
memorial 7.5
stucco 7.5
vintage 7.4
style 7.4
inside 7.4
dirty 7.2
column 7.1
life 7.1

Google
created on 2022-02-26 (no tags recorded)

Microsoft
created on 2022-02-26

black and white 96
outdoor 90.8
clothing 90.6
person 89.6
text 86.5
footwear 81.8
monochrome 68.1
drawing 62.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 28-38
Gender Female, 56.8%
Happy 56.9%
Calm 29%
Surprised 7.8%
Disgusted 1.8%
Confused 1.2%
Sad 1.2%
Fear 1.2%
Angry 0.9%

AWS Rekognition

Age 35-43
Gender Female, 97.7%
Calm 99.6%
Disgusted 0.1%
Angry 0.1%
Sad 0.1%
Surprised 0.1%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 50.5%
Calm 90.2%
Happy 4.6%
Sad 2%
Disgusted 1.3%
Angry 0.7%
Surprised 0.6%
Confused 0.5%
Fear 0.2%

AWS Rekognition

Age 37-45
Gender Male, 81.3%
Calm 97.4%
Surprised 1.3%
Happy 0.7%
Sad 0.3%
Confused 0.2%
Disgusted 0.1%
Angry 0%
Fear 0%
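
Each of the four AWS Rekognition blocks above describes one detected face: an estimated age range, a gender guess with its confidence, and a confidence distribution over eight emotions. A minimal sketch of the underlying call, again assuming configured AWS credentials and a hypothetical "photo.jpg":

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort by confidence to match the
    # listings above.
    for e in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{e["Type"].capitalize()} {e["Confidence"]:.1f}%')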

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
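
Google Vision reports face attributes as coarse likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming application credentials are configured; "photo.jpg" is hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values index this tuple, yielding the wording
# used in the listings above.
likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])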

Feature analysis

Amazon

Person
Person 99.8%
Person 99.4%
Person 98.9%
Person 97%
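
The repeated Person rows are per-instance detections: for countable labels such as Person, Rekognition returns one Instance (bounding box plus confidence) per occurrence, which is why four separate scores appear for the four figures. A minimal sketch of reading those instances from the same detect_labels response sketched earlier:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical filename, as above
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels like "Person" carry an Instances list, one entry per
# detected occurrence, each with its own box and confidence.
for label in response["Labels"]:
    for inst in label.get("Instances", []):
        box = inst["BoundingBox"]
        print(f'{label["Name"]} {inst["Confidence"]:.1f}% '
              f'(w={box["Width"]:.2f}, h={box["Height"]:.2f})')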

Categories

Text analysis

Amazon

SEI
KACOX
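
The two lines above are Rekognition's raw text detections; on prints of this era they are often stamped film-stock or studio markings along the border, which OCR tends to garble. A minimal sketch of the text-detection call, with the same assumed setup:

import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; printing the
# LINE entries reproduces a listing like the one above.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])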