Human Generated Data

Title

Untitled (people gathered around table learning to tie scarves)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15295

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.4
Human 99.4
Clinic 96.4
Person 94.9
Person 94.8
Person 94.2
Hospital 90.4
Person 88.5
Person 76.3
Operating Theatre 69.4
Sitting 64.6
Indoors 63.4
Room 59.5

Imagga
created on 2022-03-05

room 53
home 39.3
interior 38
house 35.9
table 30.9
indoors 29.9
modern 28.7
furniture 27.2
office 25
computer 24.1
man 22.9
laptop 22.8
people 22.3
sofa 21.2
business 20.7
person 20.5
window 20.2
smiling 19.5
chair 19.4
male 18.4
indoor 18.3
living 18
adult 17.9
classroom 17.9
professional 17.7
happy 17.5
apartment 17.2
sitting 17.2
meeting 17
businessman 16.8
floor 16.7
women 16.6
inside 16.6
work 16.5
luxury 16.3
architecture 16
communication 16
decor 15.9
working 15.9
design 15.2
wood 15
building 14.9
desk 14.9
domestic 14.6
businesswoman 14.5
lifestyle 14.5
lamp 14.3
light 14
together 14
men 13.7
life 13.6
residential 13.4
3d 13.2
relaxation 12.6
team 12.5
comfortable 12.4
couple 12.2
corporate 12
portrait 11.6
smile 11.4
businesspeople 11.4
technology 11.1
teamwork 11.1
wall 11.1
structure 10.8
couch 10.6
group 10.5
talking 10.5
education 10.4
happiness 10.2
teacher 10.1
color 10
living room 9.8
door 9.7
bedroom 9.5
contemporary 9.4
space 9.3
confident 9.1
cheerful 8.9
success 8.9
hall 8.8
looking 8.8
bed 8.5
library 8.5
fashion 8.3
style 8.2
family 8
carpet 7.8
lighting 7.7
project 7.7
hotel 7.6
two 7.6
estate 7.6
executive 7.5
showing 7.5
window shade 7.4
occupation 7.3
new 7.3
stylish 7.2
handsome 7.1
job 7.1
glass 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

furniture 93.6
person 93.1
table 93.1
text 82.6
house 77.7
chair 75.9
drawing 69.3
man 67
clothing 65.8
black and white 52.4

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 96.9%
Calm 66.9%
Happy 7.9%
Fear 5.1%
Sad 4.7%
Surprised 4.7%
Confused 4%
Disgusted 3.7%
Angry 3%

Google Vision (face 1)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 2)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 3)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 4)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (face 5)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people in a room 87.8%
a group of people sitting in a room 81.2%
a group of people sitting on a bed 45.5%

Text analysis

Amazon

KODAK
4
SAFETY
KOD
KODAK SAFETY FILM
FILM
C

Google

S'AFETY
FILM
KOD
КODAK S'AFETY FILM KOD
КODAK