Human Generated Data

Title

Untitled (seven young children posed sitting in front of Christmas tree)

Date

1961

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9811

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Person 99.3
Human 99.3
Person 98.8
Person 98.7
Person 98.1
Person 97.5
Person 96.8
Clothing 95.2
Apparel 95.2
Shorts 94.7
Person 94.2
Play 90
People 78.8
Furniture 77.1
Outdoors 77
Kid 76.8
Child 76.8
Nature 76.6
Indoors 75.7
Tree 72.7
Plant 72.7
Room 68.5
Couch 68.3
Living Room 67.1
Monitor 66.7
Electronics 66.7
Display 66.7
Screen 66.7
Sand 63.5
Baby 59.3
Helmet 58
Housing 56.8
Building 56.8
Photography 56.1
Photo 56.1
Floor 55.8
Face 55.6
Person 47.6
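
The record lists only the label names and confidence scores returned on 2022-01-24; it does not document how the call was made. A minimal sketch of how comparable labels could be requested from Amazon Rekognition with boto3 is shown below; the image file name is a placeholder, not the museum's actual asset.

import boto3

# Hypothetical local copy of the digitized photograph; the museum's actual
# asset path and tagging pipeline are not described in this record.
IMAGE_PATH = "4_2002_9811.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # the record shows roughly forty labels
        MinConfidence=40,    # lowest score listed above is Person 47.6
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")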

Clarifai
created on 2023-10-27

people 99.9
child 99.8
many 98.3
group 98.3
group together 98
boy 97.6
recreation 96.7
adult 96.4
education 95
woman 94.3
wear 94
man 92.9
school 92
competition 91.2
athlete 90.8
uniform 89.1
motion 88.6
enjoyment 88.1
adolescent 87.7
family 87

Imagga
created on 2022-01-24

runner 41.1
athlete 39
silhouette 31.4
person 30.9
people 24
sport 23.6
contestant 23.3
black 18.7
man 16.8
grunge 15.3
art 14.8
crowd 14.4
water 14
spectator 12.8
stage 12.2
design 11.8
vintage 11.6
symbol 11.4
old 11.1
motion 11.1
training 11.1
sky 10.8
light 10.8
male 10.6
fight 10.6
urban 10.5
success 10.5
adult 10.2
event 10.2
competition 10.1
world 10
city 10
group 9.7
muscular 9.5
icon 9.5
active 9.3
action 9.3
texture 9
activity 9
flag 8.9
stadium 8.9
dancer 8.9
fencing 8.9
sword 8.8
body 8.8
audience 8.8
dance 8.7
men 8.6
dangerous 8.6
business 8.5
energy 8.4
relaxation 8.4
lights 8.3
leisure 8.3
performer 8.3
platform 8.3
freedom 8.2
one 8.2
retro 8.2
protection 8.2
happy 8.1
dirty 8.1
building 8.1
graphic 8
weapon 7.9
stab 7.9
versus 7.9
technique 7.9
bright 7.9
cheering 7.8
happiness 7.8
nighttime 7.8
architecture 7.8
championship 7.8
blade 7.8
gear 7.7
match 7.7
summer 7.7
skill 7.7
patriotic 7.7
sharp 7.6
beach 7.6
nation 7.6
dark 7.5
clothing 7.5
life 7.4
vivid 7.4
style 7.4
paint 7.2
wet 7.2
hair 7.1
vibrant 7

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

text 97.2
person 90.6
black and white 78.2
clothing 76
man 68
crowd 0.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Male, 89.9%
Calm 88.2%
Happy 6.5%
Sad 2.4%
Surprised 1%
Fear 0.7%
Disgusted 0.5%
Confused 0.5%
Angry 0.2%

AWS Rekognition

Age 34-42
Gender Female, 92.7%
Calm 87%
Sad 9.6%
Surprised 0.9%
Confused 0.7%
Happy 0.6%
Disgusted 0.5%
Fear 0.4%
Angry 0.4%

AWS Rekognition

Age 16-22
Gender Male, 95.3%
Calm 98.3%
Sad 0.7%
Angry 0.4%
Disgusted 0.2%
Confused 0.2%
Surprised 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 26-36
Gender Male, 100%
Calm 76.9%
Fear 8.6%
Surprised 3.4%
Disgusted 2.9%
Sad 2.6%
Angry 2.2%
Confused 2%
Happy 1.5%

AWS Rekognition

Age 28-38
Gender Male, 68.6%
Surprised 63%
Calm 33.6%
Happy 1.1%
Fear 0.9%
Sad 0.5%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 24-34
Gender Female, 68.6%
Calm 99%
Happy 0.4%
Fear 0.3%
Sad 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0%
Surprised 0%

AWS Rekognition

Age 33-41
Gender Male, 98.4%
Calm 86%
Sad 4.9%
Surprised 2.8%
Disgusted 1.6%
Fear 1.4%
Confused 1.3%
Angry 1.2%
Happy 0.8%
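
Each block above corresponds to one detected face, with an estimated age range, a gender guess, and a ranked set of emotion confidences. A sketch of how such per-face attributes could be obtained from Rekognition's DetectFaces API follows; the file name is again a placeholder.

import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; the record does not identify the digitized source file.
with open("4_2002_9811.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")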

Feature analysis

Amazon

Person
Helmet
Person 99.3%
Person 98.8%
Person 98.7%
Person 98.1%
Person 97.5%
Person 96.8%
Person 94.2%
Person 47.6%
Helmet 58%

Text analysis

Amazon

MJ17--YT37A°--XX

Google

MJI7--YT3RA°2-- XAGO
MJI7--YT3RA°2--
XAGO
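
The strings above are text the services detected in the image itself, apparently markings on or around the print, so they are reproduced verbatim. A comparable result could be requested from Rekognition's DetectText API, sketched here with a placeholder file name.

import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; the record does not identify the digitized source file.
with open("4_2002_9811.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE-level detections correspond to strings like those listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))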