Human Generated Data

Title

Untitled (the Lawton family)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2174

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.5
Human 99.5
Person 98.8
Person 97.9
Person 97.8
People 97.6
Person 94.6
Person 94.2
Family 94.1
Person 91.3
Person 91.3
Person 86.4

Clarifai
created on 2023-10-15

people 100
group 99.7
portrait 99
adult 98.9
woman 97.3
man 97.3
offspring 96.1
child 95.2
three 95.2
print 94.7
administration 93.8
leader 92.8
wear 92.5
art 92.3
son 92.2
two 91.8
family 89.7
actor 89.5
street 88.1
vintage 87.3

Imagga
created on 2021-12-15

kin 100
people 21.2
man 19.5
male 17
person 16.2
old 16
dress 15.3
portrait 14.9
couple 14.8
business 13.3
statue 13.3
face 12.8
money 12.7
currency 12.5
religion 12.5
dollar 12.1
art 11.9
finance 11.8
bride 11.6
vintage 11.6
ancient 11.2
cash 11
symbol 10.8
groom 10.8
catholic 10.7
family 10.7
sculpture 10.6
sign 10.5
success 10.4
monument 10.3
love 10.2
adult 10.2
bank 9.8
fashion 9.8
bill 9.5
closeup 9.4
paper 9.4
culture 9.4
happy 9.4
religious 9.4
wedding 9.2
room 9.2
black 9
group 8.9
antique 8.8
happiness 8.6
architecture 8.6
exchange 8.6
card 8.5
church 8.3
banking 8.3
retro 8.2
aged 8.1
lady 8.1
financial 8
interior 8
businessman 7.9
hundred 7.7
men 7.7
holy 7.7
wall 7.7
hand 7.6
stone 7.6
head 7.6
bouquet 7.5
traditional 7.5
savings 7.4
design 7.3
detail 7.2
decoration 7.2
hair 7.1
women 7.1

Google
created on 2021-12-15

Dress 87
Vintage clothing 73.7
Event 67.6
Room 67.2
Art 66.8
History 66.2
Classic 64.4
Stock photography 64.2
Visual arts 59.3
Collection 56.1
Retro style 53.2

Microsoft
created on 2021-12-15

text 99.6
clothing 99
person 98.7
woman 91.3
posing 91.1
smile 89.7
human face 75.4
old 67.2
dress 65.5
vintage clothing 63.1
photograph 58.4

Face analysis

AWS Rekognition

Age 15-27
Gender Female, 60.6%
Calm 98.7%
Angry 0.5%
Happy 0.4%
Sad 0.3%
Surprised 0.1%
Fear 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 22-34
Gender Female, 60.4%
Calm 89.6%
Sad 6.2%
Angry 1.8%
Happy 0.7%
Fear 0.6%
Disgusted 0.4%
Confused 0.3%
Surprised 0.3%

AWS Rekognition

Age 23-35
Gender Male, 97.9%
Calm 98.8%
Sad 0.6%
Angry 0.2%
Fear 0.2%
Happy 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 47-65
Gender Male, 62.5%
Calm 96.3%
Sad 2.4%
Angry 0.5%
Confused 0.3%
Surprised 0.2%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-34
Gender Male, 96.2%
Calm 83.6%
Happy 10.8%
Surprised 2.1%
Sad 1.6%
Confused 1.1%
Angry 0.4%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 45-63
Gender Female, 67.4%
Happy 50.6%
Calm 43.8%
Angry 1.5%
Surprised 1.4%
Confused 1.3%
Sad 0.8%
Disgusted 0.5%
Fear 0.1%

AWS Rekognition

Age 23-37
Gender Female, 61.1%
Calm 98.1%
Sad 0.6%
Angry 0.4%
Happy 0.4%
Surprised 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 2-8
Gender Female, 72.3%
Calm 80.8%
Sad 5.6%
Angry 5%
Fear 3%
Happy 1.9%
Surprised 1.6%
Disgusted 1.2%
Confused 0.9%

AWS Rekognition

Age 1-5
Gender Female, 56.4%
Calm 90%
Surprised 3.1%
Fear 2.6%
Confused 1.9%
Sad 1.2%
Happy 0.7%
Disgusted 0.2%
Angry 0.2%

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 57
Gender Male

Microsoft Cognitive Services

Age 18
Gender Female

Microsoft Cognitive Services

Age 67
Gender Female

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

people portraits 94.2%
paintings art 5.1%