Human Generated Data

Title

Untitled (girls playing with dolls)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16897

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 94.4
Human 94.4
Furniture 78.8
Helmet 75.5
Clothing 75.5
Apparel 75.5
Person 70.4
Screen 66.8
Electronics 66.8
Monitor 64.3
Display 64.3
Kid 56.6
Child 56.6
Floor 56
Overcoat 55.4
Coat 55.4

Clarifai
created on 2023-10-29

people 99.9
two 98.4
adult 98.4
child 96.7
furniture 96.3
woman 96.2
chair 96.2
man 95.9
elderly 95.3
music 95.2
monochrome 94.2
vehicle 94.2
seat 94.1
group together 94.1
group 93.7
wear 92.7
medical practitioner 92.3
musician 92.1
stretcher 91.5
three 91

Imagga
created on 2022-02-26

musical instrument 45.1
stringed instrument 38.7
violin 27.4
music 23.2
bowed stringed instrument 23.1
person 22.7
guitar 21.4
chair 21.1
man 20.8
people 20.6
play 18.1
trombone 18
adult 17.7
musician 17.5
brass 17.4
instrument 15.3
banjo 14.4
wind instrument 14.4
male 14.2
lifestyle 13.7
playing 13.7
portrait 12.9
sitting 12.9
computer 12.8
casual 12.7
attractive 12.6
happy 12.5
men 12
string 11.6
musical 11.5
working 11.5
laptop 11.1
work 11
business 10.9
concert 10.7
technology 10.4
black 10.2
smile 10
park 9.9
hand 9.9
acoustic 9.8
seat 9.6
sit 9.4
device 9.4
stretcher 9
equipment 8.9
one 8.9
job 8.8
acoustic guitar 8.8
women 8.7
grandfather 8.7
classical 8.6
sound 8.4
old 8.3
bench 8.2
student 8.1
worker 8
indoors 7.9
melody 7.8
boy 7.8
modern 7.7
outdoor 7.6
fashion 7.5
outdoors 7.5
rotisserie 7.3
alone 7.3
indoor 7.3
lady 7.3
smiling 7.2
office 7.2
sexy 7.2
litter 7.2
handsome 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 96.4
person 93.1
text 88.9
clothing 64.7
monochrome 56
music 54.2

Color Analysis

Face analysis

AWS Rekognition

Age 16-22
Gender Female, 87.6%
Happy 61.8%
Calm 26%
Surprised 4.8%
Sad 4.6%
Fear 0.9%
Angry 0.8%
Disgusted 0.7%
Confused 0.3%

AWS Rekognition

Age 19-27
Gender Female, 65.9%
Calm 99.8%
Sad 0.1%
Surprised 0%
Happy 0%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Helmet
Person 94.4%
Person 70.4%
Helmet 75.5%

Categories

Imagga

interior objects 89.3%
paintings art 5.3%
food drinks 2.6%

Text analysis

Amazon

TST