Human Generated Data

Title

Untitled (woman reclining in chair, reading, and smoking)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14635

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Chair 93.1
Furniture 93.1
Human 92.8
Person 92.8
Clothing 92.5
Apparel 92.5
Person 83.7
Sitting 78.8
Female 78.5
Girl 66.3
Photo 65
Portrait 65
Face 65
Photography 65
Couch 63.3
Woman 62.6
Art 62.4
Plant 60.3
Flower 60.3
Blossom 60.3
Leisure Activities 56.8

Imagga
created on 2022-01-29

blackboard 81.4
classroom 29.1
room 28.8
male 26.9
people 26.2
man 24.9
business 24.3
businessman 23.8
person 23.4
musical instrument 23
chair 23
wind instrument 22
brass 21.9
office 20.1
adult 19.6
laptop 19.3
work 18.8
computer 18.5
group 16.9
table 16.5
women 14.2
indoors 14.1
modern 14
executive 14
sitting 13.7
job 13.3
interior 13.3
working 13.2
lifestyle 13
student 12.7
desk 12.4
education 12.1
men 12
home 12
happy 11.9
professional 11.8
team 11.6
smiling 11.6
stringed instrument 11.3
businesswoman 10.9
teacher 10.9
worker 10.8
corporate 10.3
day 10.2
communication 10.1
indoor 10
cornet 10
suit 9.9
sax 9.7
portrait 9.7
success 9.7
technology 9.6
businesspeople 9.5
bowed stringed instrument 9.5
meeting 9.4
study 9.3
casual 9.3
school 8.9
smile 8.5
black 8.4
newspaper 8.4
trombone 8.4
house 8.4
holding 8.3
occupation 8.2
confident 8.2
style 8.2
board 8.1
cheerful 8.1
copy space 8.1
looking 8
urban 7.9
employee 7.8
boy 7.8
glass 7.8
class 7.7
exam 7.7
boss 7.6
studio 7.6
relax 7.6
manager 7.4
product 7.4
window 7.3
successful 7.3
violin 7.2
music 7.2
handsome 7.1
face 7.1
happiness 7

Microsoft
created on 2022-01-29

text 99.6
furniture 96.8
person 87.1
chair 86.1
table 74.5
black and white 63.2
posing 39.6

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 92.5%
Calm 99.7%
Sad 0.2%
Disgusted 0%
Surprised 0%
Confused 0%
Happy 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 93.1%
Person 92.8%

Captions

Microsoft

a person sitting in front of a window 40%
a person standing in front of a window 39.9%
a person standing in front of a window 39.8%

Text analysis

Amazon

MJI7
MJI7 ACCHA
ACCHA

Google

MJ17
A
MJ17 YT3RA A
YT3RA