Human Generated Data

Title

Untitled (women holding toys)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20248

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2022-03-05

Person 99.3
Human 99.3
Person 98.7
Musician 98.6
Musical Instrument 98.6
Person 98
Person 97.7
Person 97.6
Interior Design 97.2
Indoors 97.2
Person 97
Person 94.6
Leisure Activities 88.6
Drum 86.3
Percussion 86.3
Music Band 85.1
Performer 58.8
Guitarist 55.1
Guitar 55.1
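
These label/confidence pairs have the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of a comparable call via boto3 follows; the file path, MaxLabels, and MinConfidence values are assumptions, not the settings actually used to produce the list above.

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,       # the list above holds 19 labels
            MinConfidence=50,   # lowest confidence above is 55.1 (Guitar)
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")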

Clarifai
created on 2023-10-22

people 99.7
group 99.2
woman 98.4
adult 98.3
group together 98
drum 97.1
monochrome 96.6
man 96.6
music 96.5
musician 94.4
drummer 93.8
actor 93.5
percussion instrument 93.5
child 92.8
retro 90.3
instrument 88.3
nostalgia 87.7
sitting 86.4
singer 85.8
wear 85.7
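
Clarifai's general recognition model returns the same concept/confidence shape (raw values 0-1, shown above scaled to percentages). A sketch using the documented clarifai-grpc flow; the access token and image URL are placeholders.

    from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
    from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

    stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
    metadata = (("authorization", "Key YOUR_PAT"),)  # hypothetical personal access token

    request = service_pb2.PostModelOutputsRequest(
        user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
        model_id="general-image-recognition",  # Clarifai's public general model
        inputs=[resources_pb2.Input(data=resources_pb2.Data(
            image=resources_pb2.Image(url="https://example.org/photo.jpg")))],
    )
    response = stub.PostModelOutputs(request, metadata=metadata)

    for concept in response.outputs[0].data.concepts:
        print(concept.name, round(concept.value * 100, 1))  # e.g. "people 99.7"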

Imagga
created on 2022-03-05

musical instrument 37.9
wheelchair 35.4
drum 33.4
percussion instrument 28.4
man 24.8
adult 22.8
person 21.9
male 20.7
people 20.6
chair 20.6
banjo 17.3
wheel 17
seat 16.1
outdoors 15.7
happy 15.6
sport 15.6
attractive 15.4
smiling 15.2
men 14.6
stringed instrument 13.9
portrait 13.6
human 13.5
fun 13.5
world 12.8
smile 12.1
outside 12
health 11.8
music 10.9
disabled 10.8
care 10.7
face 10.6
athlete 10.5
pretty 10.5
looking 10.4
happiness 10.2
training 10.2
street 10.1
active 10.1
exercise 10
fitness 9.9
guitar 9.7
summer 9.6
black 9.6
lifestyle 9.4
cute 9.3
sexy 8.8
healthy 8.8
bicycle 8.8
body 8.8
play 8.6
sitting 8.6
model 8.5
vehicle 8.4
senior 8.4
outdoor 8.4
equipment 8.4
old 8.3
player 8.3
ball 8.3
fashion 8.3
holding 8.2
furniture 8
posing 8
women 7.9
together 7.9
couple 7.8
bike 7.8
sick 7.7
expression 7.7
musical 7.7
help 7.4
device 7.3
cheerful 7.3
playing 7.3
musician 7.2
recreation 7.2
clothing 7.1
working 7.1
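
Imagga's tagger is a plain REST endpoint with HTTP basic auth. A sketch with requests; the key, secret, and image URL are placeholders.

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # hypothetical credentials
    )

    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))  # e.g. "drum 33.4"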

Google
created on 2022-03-05

(no tags returned)

Microsoft
created on 2022-03-05

musical instrument 93.4
person 93.2
drum 92.8
black and white 84.6
window 82.5
clothing 81.7
text 77
man 60.6
guitar 54.6
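
The Microsoft tags match the Azure AI Vision tagging output. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/",  # hypothetical endpoint
        CognitiveServicesCredentials("YOUR_KEY"),
    )

    result = client.tag_image("https://example.org/photo.jpg")
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))  # e.g. "drum 92.8"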

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 98.1%
Surprised 79.2%
Happy 8.5%
Confused 5.6%
Calm 4.7%
Fear 0.9%
Disgusted 0.5%
Angry 0.3%
Sad 0.2%

AWS Rekognition

Age 40-48
Gender Male, 96.8%
Fear 42%
Confused 24.2%
Sad 14.4%
Happy 5.9%
Surprised 5.9%
Disgusted 3%
Calm 3%
Angry 1.7%

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 64.7%
Happy 29.5%
Sad 1.5%
Surprised 1.4%
Fear 0.9%
Confused 0.8%
Disgusted 0.8%
Angry 0.3%

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Calm 92.1%
Surprised 5.2%
Happy 1.5%
Confused 0.5%
Disgusted 0.3%
Sad 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 39-47
Gender Male, 74%
Happy 36.1%
Calm 36%
Sad 18.4%
Surprised 3.2%
Confused 3.1%
Disgusted 1.5%
Fear 1.2%
Angry 0.5%

AWS Rekognition

Age 37-45
Gender Female, 91.8%
Calm 96.1%
Surprised 1.9%
Happy 1.2%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Sad 0.1%
Angry 0.1%
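
The six face records above (age range, gender guess, ranked emotion scores) are the shape of Rekognition's DetectFaces output with all attributes requested. A sketch; the file path is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # default returns only bounding box and pose
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")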

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
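
The repeated six-attribute blocks above are Google Cloud Vision's face-detection likelihood buckets (VERY_UNLIKELY through VERY_LIKELY), one block per detected face. A sketch with the google-cloud-vision client; the file path is a placeholder and application default credentials are assumed.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        for name, value in [
            ("Surprise", face.surprise_likelihood),
            ("Anger", face.anger_likelihood),
            ("Sorrow", face.sorrow_likelihood),
            ("Joy", face.joy_likelihood),
            ("Headwear", face.headwear_likelihood),
            ("Blurred", face.blurred_likelihood),
        ]:
            print(name, vision.Likelihood(value).name)  # e.g. "Joy VERY_UNLIKELY"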

Feature analysis

Amazon

Person
Person 99.3%
Person 98.7%
Person 98%
Person 97.7%
Person 97.6%
Person 97%
Person 94.6%
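
These per-person percentages repeat the Person label's instance detections: in the DetectLabels response, each Person instance carries its own bounding box and confidence. A sketch; the file path is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

    for label in labels:
        if label["Name"] == "Person":
            for instance in label.get("Instances", []):
                box = instance["BoundingBox"]  # normalized Left/Top/Width/Height
                print(f"Person {instance['Confidence']:.1f}% at {box}")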

Text analysis

Amazon

E
KODVK-EELA
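
"E" and "KODVK-EELA" are the raw DetectedText strings from Rekognition's DetectText; the second is plausibly a misread Kodak safety-film edge marking. A sketch; the file path is a placeholder.

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE entries aggregate WORD entries; strings come back verbatim, so
    # film edge markings are often garbled (hence "KODVK-EELA").
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])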