Human Generated Data

Title

Untitled (fashion model walking on floral carpet)

Date

1952

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14744

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-01-29

Person 97.7
Human 97.7
Canvas 93.8
Clothing 88.9
Apparel 88.9
Art 80.1
Rug 70.7
Text 67.2
Photography 61.7
Photo 61.7
Portrait 60.9
Face 60.9
Furniture 59.7
Bed 57.2
Floor 57.1
Drawing 56.6
Flooring 56.4
Advertisement 55.3
Evening Dress 55.2
Fashion 55.2
Gown 55.2
Robe 55.2
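
The Amazon values above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call using boto3 follows; the file name, label limit, and confidence threshold are illustrative assumptions, not details of the actual tagging pipeline.

```python
import boto3

# Assumes AWS credentials are configured in the environment.
rekognition = boto3.client("rekognition")

# "photo.jpg" is a hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,          # illustrative limit
    MinConfidence=55.0,    # the tags above bottom out around 55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```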

Clarifai
created on 2023-10-27

people 99
man 97.9
monochrome 97
woman 91.1
art 90.4
creativity 84.6
adult 84.6
two 81
business 80.6
indoors 80.3
illustration 80.1
family 79.9
fashion 74.8
retro 74.6
painting 72.6
leader 71.6
wear 71.3
actor 70.4
chalk out 69.6
option 68.8
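
The Clarifai concepts above resemble the output of Clarifai's general image-recognition model. The sketch below assumes the v2 REST predict endpoint with a placeholder API key, model ID, and image URL; the exact endpoint, auth scheme, and model used for this record are not documented here.

```python
import requests

CLARIFAI_KEY = "YOUR_API_KEY"           # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed public general-concepts model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

# Concepts come back with values in 0-1; scale to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```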

Imagga
created on 2022-01-29

adult 35.1
people 32.3
person 32.3
computer 28.2
laptop 26.7
happy 25.7
business 25.5
sitting 24.9
work 23.5
office 23.4
smile 22.8
home 22.3
working 21.2
women 20.6
job 19.5
indoors 19.3
attractive 18.9
portrait 18.8
groom 18.6
corporate 18
male 17
man 16.9
pretty 16.1
smiling 15.9
desk 15.3
professional 15.1
businesswoman 14.5
lifestyle 14.5
worker 14.2
bedroom 14.1
room 14.1
sofa 13.8
one 13.4
technology 13.4
businessman 13.2
bride 12.9
jacket 12.7
student 12.7
brunette 12.2
executive 12
dress 11.7
cheerful 11.4
face 11.4
keyboard 11.3
education 11.3
book 11.2
casual 11
product 10.9
confident 10.9
relaxing 10.9
model 10.9
bed 10.8
couch 10.6
modern 10.5
thinking 10.4
newspaper 10.2
happiness 10.2
suit 10
book jacket 10
leisure 10
dinner dress 9.8
human 9.7
hair 9.5
reading 9.5
youth 9.4
manager 9.3
communication 9.2
wedding 9.2
creation 9.2
holding 9.1
lady 8.9
iron 8.9
success 8.9
sexy 8.8
corporation 8.7
love 8.7
expression 8.5
study 8.4
relaxation 8.4
house 8.4
fun 8.2
alone 8.2
blond 8.1
group 8.1
looking 8
paper 8
home appliance 7.8
hands 7.8
secretary 7.7
stress 7.7
wireless 7.6
resting 7.6
hand 7.6
businesspeople 7.6
fashion 7.5
meeting 7.5
lying 7.5
phone 7.4
light 7.4
20s 7.3
black 7.2
notebook 7.2
interior 7.1
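
The Imagga tags above look like output from Imagga's tagging endpoint. The sketch below assumes the public /v2/tags REST API with basic auth; the credentials and image URL are placeholders.

```python
import requests

IMAGGA_KEY = "YOUR_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder image URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

# Each entry carries a 0-100 confidence and per-language tag text.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```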

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.7
wall 98.1
posing 86.5
person 85.3
standing 84.1
clothing 80.5
drawing 72.6
room 62.2
black and white 62.1
sketch 60.9
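
The Microsoft tags above are consistent with the Azure Computer Vision "analyze" operation with the Tags visual feature. The sketch below assumes a v3.2 REST endpoint with a placeholder resource name, key, and image URL.

```python
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image URL
)

# Confidences come back in 0-1; scale to match the percentages above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```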

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 98.1%
Happy 93.4%
Surprised 4.1%
Calm 1.1%
Angry 0.4%
Sad 0.3%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
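
The age range, gender, and emotion percentages above are the kind of attributes returned by Amazon Rekognition's DetectFaces when all facial attributes are requested. A minimal boto3 sketch, with an illustrative file name:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:         # hypothetical local copy of the image
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, and other attributes.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```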

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
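
Google Vision face detection reports likelihood buckets (e.g. "Very unlikely") rather than percentages, as reflected above. A minimal sketch using the google-cloud-vision client; the file name is illustrative:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google Cloud credentials are configured

with open("photo.jpg", "rb") as f:      # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum such as VERY_UNLIKELY rather than a score.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```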

Feature analysis

Amazon

Person 97.7%
Rug 70.7%

Text analysis

Amazon

FILM
KODAK
KODAK SAFETY FILM
SAFETY
KODAK SAF
SAF
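
The Amazon strings above match the LINE and WORD detections returned by Rekognition's DetectText, which here pick up the "KODAK SAFETY FILM" edge printing. A minimal boto3 sketch, with an illustrative file name:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:         # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a full LINE or an individual WORD.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```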

Google

KODA K S°A FETY FILM KODAK SA F
KODA
K
S°A
FETY
FILM
KODAK
SA
F
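
The Google results above follow the shape of Cloud Vision OCR output, where the first text annotation is the full detected string and the remaining annotations are individual tokens. A minimal sketch using the google-cloud-vision client; the file name is illustrative:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google Cloud credentials are configured

with open("photo.jpg", "rb") as f:      # hypothetical local copy of the image
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```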