Human Generated Data

Title

Untitled (children in line waiting to receive Christmas gifts)

Date

1948

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19376

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Clothing 99.8
Apparel 99.8
Person 99.6
Person 99.4
Person 99
Person 99
Shorts 99
Person 97.6
Person 93
Shoe 92.4
Footwear 92.4
Person 92.1
Dress 91.7
Shoe 91.5
Person 90.6
Shoe 86.9
Female 83
Shoe 78.2
Person 74.5
Person 73.8
Shoe 73.5
Person 72.8
Skirt 70.2
Kid 68.8
Child 68.8
Person 68.7
Girl 66.2
Shoe 64.1
Shoe 63.3
Woman 63.1
Person 62.8
Shoe 62.2
Poster 58.6
Advertisement 58.6
Shoe 57.1
Play 55.9
Shoe 50.7
Person 46.4
Person 42.9

Clarifai
created on 2023-10-22

people 99.9
child 99.5
group 99.1
man 96.7
monochrome 96.6
many 96.3
adult 94.9
woman 94.8
group together 94.5
boy 91.2
dancing 90.8
crowd 90.6
son 88.2
music 84.5
art 84.2
recreation 82.7
school 82
family 81.7
wear 81.2
education 81

Imagga
created on 2022-03-05

brass 56
wind instrument 43.6
people 32.9
silhouette 32.3
musical instrument 30.7
man 26.2
person 24.5
male 23.4
group 23.4
sunset 21.6
adult 20.9
men 19.7
cornet 18.2
beach 17.7
trombone 17.3
together 16.6
team 16.1
women 15.8
human 15.7
businessman 15
couple 13.9
happy 13.8
success 13.7
fun 13.5
black 13.4
business 13.4
walking 13.3
sky 12.7
sport 12.6
active 12.5
world 12.4
lifestyle 12.3
teamwork 12
summer 11.6
crowd 11.5
fashion 11.3
boy 11.3
body 11.2
water 10.7
sun 10.5
friendship 10.3
work 10.2
child 10.2
professional 10.1
ocean 10
dance 9.9
dress 9.9
dancer 9.9
vacation 9.8
ball 9.8
lady 9.7
outdoors 9.7
silhouettes 9.7
style 9.6
love 9.5
evening 9.3
teen 9.2
outdoor 9.2
attractive 9.1
family 8.9
posing 8.9
sexy 8.8
run 8.7
sea 8.6
corporate 8.6
dusk 8.6
model 8.5
modern 8.4
dark 8.3
life 8.3
action 8.3
leisure 8.3
competition 8.2
athlete 8.2
recreation 8.1
device 8.1
performer 8
teacher 7.9
sand 7.9
happiness 7.8
standing 7.8
portrait 7.8
baritone 7.7
motion 7.7
wall 7.7
kin 7.6
walk 7.6
hand 7.6
design 7.3
graphic 7.3
teenager 7.3
businesswoman 7.3
exercise 7.3
fitness 7.2
copy space 7.2
looking 7.2
shadow 7.2
holiday 7.2
father 7.1
travel 7

Google
created on 2022-03-05

Black 89.6
Black-and-white 86.5
Standing 86.4
Sleeve 85.9
Style 84.1
Art 82.7
Font 80.4
Adaptation 79.3
Monochrome 79
Monochrome photography 77
Vintage clothing 75.1
Pattern 72.7
Event 72.6
Team 71.4
Fashion design 70.2
Design 68.6
Visual arts 68.5
Fun 68
History 66.3
Room 65.8

Microsoft
created on 2022-03-05

person 99.5
clothing 97
street 92.7
people 91.5
footwear 90.2
black and white 89.8
text 87.9
indoor 86.7
group 56.7
crowd 48.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Female, 85.9%
Surprised 56%
Calm 28.7%
Disgusted 4.7%
Sad 2.9%
Angry 2.6%
Fear 2.5%
Confused 1.9%
Happy 0.6%

AWS Rekognition

Age 29-39
Gender Female, 90%
Surprised 94.1%
Calm 2.8%
Fear 1.5%
Angry 0.6%
Sad 0.5%
Disgusted 0.2%
Confused 0.2%
Happy 0.1%

AWS Rekognition

Age 45-51
Gender Male, 95.2%
Sad 94.7%
Confused 1.8%
Calm 1.1%
Happy 1%
Surprised 0.6%
Fear 0.3%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 38-46
Gender Male, 96.5%
Calm 72%
Happy 19.8%
Sad 2.7%
Confused 1.8%
Disgusted 1.7%
Surprised 1%
Angry 0.6%
Fear 0.4%

AWS Rekognition

Age 25-35
Gender Female, 63.6%
Surprised 82.6%
Calm 13.2%
Fear 2.5%
Sad 0.4%
Disgusted 0.4%
Confused 0.4%
Angry 0.3%
Happy 0.1%

AWS Rekognition

Age 26-36
Gender Female, 51.4%
Surprised 83.5%
Calm 15.2%
Fear 0.7%
Disgusted 0.2%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Sad 0%

AWS Rekognition

Age 12-20
Gender Male, 99.8%
Calm 99.7%
Sad 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 98%
Calm 98.5%
Surprised 0.8%
Sad 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 70%
Fear 87.5%
Calm 6.8%
Happy 1.8%
Sad 1.8%
Disgusted 1%
Confused 0.5%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 21-29
Gender Male, 99.1%
Calm 99%
Sad 0.6%
Surprised 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 13-21
Gender Female, 64.7%
Happy 27%
Disgusted 25.5%
Calm 19.7%
Fear 15.6%
Surprised 6%
Angry 4.6%
Sad 0.8%
Confused 0.7%

AWS Rekognition

Age 22-30
Gender Male, 87.5%
Calm 58.8%
Sad 25.1%
Fear 11.6%
Surprised 1.9%
Angry 1%
Disgusted 0.9%
Confused 0.4%
Happy 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.8%
Person 99.6%
Person 99.4%
Person 99%
Person 99%
Person 97.6%
Person 93%
Person 92.1%
Person 90.6%
Person 74.5%
Person 73.8%
Person 72.8%
Person 68.7%
Person 62.8%
Person 46.4%
Person 42.9%
Shoe 92.4%
Shoe 91.5%
Shoe 86.9%
Shoe 78.2%
Shoe 73.5%
Shoe 64.1%
Shoe 63.3%
Shoe 62.2%
Shoe 57.1%
Shoe 50.7%

Text analysis

Amazon

TE
KODOK-SE