Human Generated Data

Title

Untitled (woman in long dress playing Steinway piano)

Date

c. 1930

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12408

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clarifai
created on 2023-10-27

piano 100
music 100
pianist 100
musician 99.9
instrument 99.9
people 99.7
jazz 99.7
classical music 99.6
one 98.8
art 97.6
rehearsal 96.7
adult 95.8
monochrome 94.7
man 94.7
opera 93.7
synthesizer 93.7
keyboard 93.3
play 92.9
singer 92
two 91.9

Imagga
created on 2022-01-29

person 28.8
man 28.2
nurse 27.9
patient 27
surgeon 25.6
people 25.1
male 22.7
hospital 21.6
home 21.5
indoors 20.2
adult 20.1
professional 19.6
medical 19.4
room 18.3
worker 17.9
health 16.7
specialist 16
device 15.5
doctor 15
working 15
medicine 15
work 14.9
equipment 14
men 13.7
lifestyle 13.7
case 13.7
sick person 13.5
clinic 13.3
interior 13.3
office 13.1
salon 13.1
business 12.7
chair 12.5
businessman 12.4
portrait 12.3
face 12.1
happy 11.9
surgery 11.7
emergency 11.6
uniform 11.5
mask 11.5
profession 11.5
modern 11.2
women 11.1
inside 11
casual 11
indoor 10.9
house 10.9
smiling 10.8
care 10.7
machine 10.6
job 10.6
surgical 9.9
computer 9.7
technology 9.6
looking 9.6
illness 9.5
corporate 9.4
sitting 9.4
occupation 9.2
alone 9.1
human 9
negative 9
family 8.9
furniture 8.7
standing 8.7
clothing 8.6
appliance 8.6
smile 8.5
black 8.4
hand blower 8.4
one 8.2
girls 8.2
life 8.1
hair 7.9
operation 7.9
sterile 7.9
instrument 7.7
dryer 7.7
exam 7.7
research 7.6
communication 7.6
seat 7.5
one person 7.5
senior 7.5
mature 7.4
team 7.2
film 7.1
architecture 7
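
The Imagga tags above, each paired with a confidence score, are the kind of output an image-tagging endpoint returns. A minimal sketch, assuming Imagga's v2 tagging REST endpoint with HTTP Basic authentication; the credentials and image URL are placeholders, not values from this record:

import requests

# Placeholder credentials and image URL; substitute real values.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photograph.jpg"

# Assumed Imagga v2 tagging endpoint, authenticated with key/secret via Basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries an English tag name and a confidence score.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")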

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 99.5%
Calm 96.6%
Sad 2%
Surprised 0.4%
Happy 0.3%
Fear 0.2%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%
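
The age range, gender, and emotion scores above are the kind of result returned by Amazon Rekognition's DetectFaces operation. A minimal sketch using boto3, assuming AWS credentials are configured; the image file name is hypothetical:

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local copy of the photograph

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back uppercase, e.g. CALM, SAD, SURPRISED.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")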

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
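
The likelihood labels above (Very unlikely, etc.) correspond to the likelihood enum returned by Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library and application default credentials are set up; the file name is hypothetical:

from google.cloud import vision

IMAGE_PATH = "photograph.jpg"  # hypothetical local copy of the photograph

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values print as enum names such as VERY_UNLIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)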

Feature analysis

Amazon

Person 99.4%
Piano 85.5%
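
The Person and Piano detections above resemble the labels returned by Amazon Rekognition's DetectLabels operation. A minimal sketch along the same lines, with the same hypothetical file name and credential assumptions as the face-analysis sketch:

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local copy of the photograph

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=80,  # keep only fairly confident labels
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")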

Text analysis

Amazon

3
742
DALL 5 3 742
5
DALL
USES
E SONS
50 USES ad
50
EST
SIGNATURE E SONS
ad
SIGNATURE
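
Mixed fragments like those above are typical of Amazon Rekognition's DetectText output, which returns both full LINE detections and their individual WORDs. A minimal sketch, again with a hypothetical file name:

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local copy of the photograph

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is either a full LINE or one of its WORDs.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])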

Google

MAGON-YT3RA2-MAMT2A 3854 S SOS DAL 5 3 74
MAGON-YT3RA2-MAMT2A
3854
S
SOS
DAL
5
3
74
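
A comparable sketch for the Google results above, using the Cloud Vision client library's text detection with the same hypothetical file name and credential assumptions as the earlier face-detection sketch:

from google.cloud import vision

IMAGE_PATH = "photograph.jpg"  # hypothetical local copy of the photograph

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)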