Human Generated Data

Title

Untitled (boy seated at piano, flanked by siblings)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12437

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Furniture 98.3
Chair 98.3
Shorts 96.1
Clothing 96.1
Apparel 96.1
Person 92.6
Footwear 87.1
Shoe 87.1
Indoors 81.4
Room 81.4
Person 81.2
Flooring 74.2
Shoe 72
People 61.5
Electronics 57
Display 57
Monitor 57
Screen 57
Floor 56.5
Female 55

Imagga
created on 2022-01-29

classroom 73.9
room 70.3
man 39.6
male 32.6
people 32.3
person 31.6
teacher 29.8
barbershop 25.2
adult 22.6
men 21.5
office 21.2
businessman 21.2
smiling 21
happy 20.7
sitting 20.6
indoors 20.2
business 20
shop 19.2
computer 18.5
couple 18.3
senior 17.8
professional 17.7
table 17.7
laptop 17.6
indoor 16.4
educator 15.7
cheerful 15.4
chair 15.4
mercantile establishment 15.2
desk 15.1
mature 14.9
two 14.4
women 14.2
home 13.6
smile 13.5
mid adult 13.5
hospital 13.1
education 13
group 12.9
happiness 12.5
together 12.3
patient 12
back 11.9
work 11.8
30s 11.5
job 11.5
working 11.5
meeting 11.3
blackboard 11.2
corporate 11.2
executive 11.1
family 10.7
school 10.5
place of business 10.1
holding 9.9
suit 9.9
modern 9.8
nurse 9.6
looking 9.6
teamwork 9.3
student 9.2
phone 9.2
horizontal 9.2
furniture 9.2
worker 9.1
team 9
handsome 8.9
interior 8.8
colleagues 8.7
boy 8.7
lifestyle 8.7
businesspeople 8.5
casual 8.5
communication 8.4
study 8.4
color 8.3
businesswoman 8.2
new 8.1
success 8
to 8
medical 7.9
seat 7.9
love 7.9
face 7.8
teaching 7.8
two people 7.8
portrait 7.8
class 7.7
old 7.7
finance 7.6
pair 7.6
relationship 7.5
board 7.2
black 7.2
case 7.1
day 7.1

Microsoft
created on 2022-01-29

text 98.1
furniture 93.1
piano 87.4
person 73.4
clothing 66.4
gallery 66.3
chair 65.9
table 60.4

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 71.9%
Sad 62.5%
Calm 22.9%
Fear 6%
Confused 2.5%
Happy 2.3%
Disgusted 1.7%
Angry 1.1%
Surprised 1%

AWS Rekognition

Age 36-44
Gender Female, 97.3%
Calm 99.7%
Surprised 0.2%
Sad 0%
Disgusted 0%
Confused 0%
Happy 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Chair 98.3%
Shoe 87.1%

Captions

Microsoft

a group of people in a room 89.8%
a group of people standing in a room 87.2%
a group of people sitting in a room 81.5%

Text analysis

Amazon

3
7
HX
POSE
HX 3 7 O G
ЕГГН
VELV SALETA ЕГГН
VELV
SALETA
O
G