Human Generated Data

Title

Untitled (portrait of family standing on front porch)

Date

c. 1945

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3401

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.3
Person 99.1
Person 99
Person 99
Person 98.8
Person 97.1
Person 96
Person 95.8
Person 95.3
Clothing 94.8
Apparel 94.8
People 86.9
Door 83.7
Person 77.5
Person 75.4
Family 55.2

Clarifai
created on 2023-10-26

people 99.9
group 99.5
child 98.8
family 97
group together 96.8
woman 96.2
education 95.2
adult 94.8
man 93.7
boy 93.3
many 91.9
son 91.2
school 90.5
three 88.6
administration 88.3
four 86.8
leader 86.6
room 86.2
sibling 85.2
several 84.7

Imagga
created on 2022-01-22

kin 48.6
man 32.2
people 31.2
male 26.3
person 21.8
women 20.5
window 20.3
couple 18.3
business 18.2
adult 18.2
office 18
businessman 16.8
silhouette 16.5
men 16.3
family 15.1
happy 15
portrait 14.9
room 14.1
modern 14
together 14
sitting 13.7
building 13.3
home 12.8
human 12.7
love 12.6
group 12.1
two 11.9
life 11.7
indoors 11.4
indoor 11
professional 10.7
boy 10.4
barbershop 10.4
musical instrument 10
job 9.7
husband 9.5
corporate 9.4
happiness 9.4
lifestyle 9.4
youth 9.4
black 9.1
shop 9
urban 8.7
smiling 8.7
wife 8.5
meeting 8.5
child 8.5
senior 8.4
outdoor 8.4
mother 8.4
executive 8.3
light 8
interior 8
working 7.9
holiday 7.9
work 7.8
standing 7.8
architecture 7.8
boss 7.7
loving 7.6
walk 7.6
career 7.6
worker 7.6
togetherness 7.5
relaxation 7.5
future 7.4
care 7.4
cheerful 7.3
copy space 7.2
computer 7.2
sunset 7.2
team 7.2
door 7.1
day 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

window 99
person 98.9
clothing 98.2
text 97.1
standing 89.7
footwear 84.5
man 84.5
people 81.6
woman 74
group 73.9
dress 71
gallery 67.5
posing 63
room 44.1
picture frame 17.9

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 99.7%
Happy 65.4%
Calm 32%
Surprised 1.2%
Confused 0.4%
Disgusted 0.4%
Sad 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 93.9%
Sad 4.2%
Surprised 1.1%
Happy 0.4%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 26-36
Gender Female, 90.8%
Happy 87.5%
Calm 11.4%
Sad 0.5%
Surprised 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 27-37
Gender Male, 98.3%
Calm 99.8%
Happy 0.2%
Surprised 0%
Disgusted 0%
Confused 0%
Sad 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 33-41
Gender Male, 99.3%
Calm 69.6%
Happy 28%
Sad 0.7%
Confused 0.6%
Surprised 0.4%
Disgusted 0.4%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 96.5%
Happy 88.8%
Sad 6.3%
Confused 1.8%
Surprised 1.2%
Calm 0.8%
Disgusted 0.6%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 52-60
Gender Male, 92.5%
Calm 100%
Sad 0%
Confused 0%
Surprised 0%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Male, 99.7%
Calm 54.9%
Sad 22.4%
Confused 11.2%
Fear 5.1%
Disgusted 2.6%
Happy 2%
Angry 1%
Surprised 0.8%

AWS Rekognition

Age 43-51
Gender Male, 97.3%
Sad 85.7%
Calm 11.9%
Disgusted 0.6%
Confused 0.5%
Angry 0.5%
Happy 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Male, 81.6%
Happy 96.6%
Calm 1.8%
Sad 0.5%
Surprised 0.4%
Angry 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 97.3%
Calm 65.2%
Happy 10.8%
Angry 6.5%
Confused 5.6%
Sad 5.6%
Surprised 3.5%
Disgusted 1.9%
Fear 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Text analysis

Amazon

2