Human Generated Data

Title

Untitled (people sitting in airport waiting area, man and woman)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16059.3

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Apparel 99.9
Clothing 99.9
Human 99.7
Person 99.7
Person 99.3
Person 97.3
Person 90.5
Female 82.3
Shorts 80.8
Shoe 78.7
Footwear 78.7
Dance Pose 70.4
Leisure Activities 70.4
Woman 65.7
Text 64.9
Door 63
Advertisement 62
Poster 62
Sleeve 58.1
Photo 57.2
Portrait 57.2
Photography 57.2
Face 57.2
Flooring 56.3
Skirt 55.6
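The Amazon tag list above pairs each label with a confidence score from 0 to 100. As an illustration only (not the museum's own tooling), a minimal sketch of parsing such `Label score` lines and filtering them by a confidence threshold:

```python
def parse_tags(lines):
    """Parse 'Label 99.9' lines into (label, score) pairs.

    The score is the final whitespace-separated token; everything before it
    is the label, which may itself contain spaces (e.g. 'Dance Pose 70.4').
    """
    tags = []
    for line in lines:
        label, _, score = line.strip().rpartition(" ")
        tags.append((label, float(score)))
    return tags


def above_threshold(tags, min_confidence=90.0):
    """Keep only tags at or above the given confidence score."""
    return [(label, score) for label, score in tags if score >= min_confidence]


# A few tags copied from the Amazon list above.
sample = ["Apparel 99.9", "Person 99.7", "Dance Pose 70.4", "Door 63"]
print(above_threshold(parse_tags(sample)))  # → [('Apparel', 99.9), ('Person', 99.7)]
```

The `min_confidence` default of 90 is an arbitrary choice for the sketch; services like Rekognition accept a similar minimum-confidence parameter at request time.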

Imagga
created on 2022-02-11

person 28
people 23.4
newspaper 23.1
product 22.9
clothing 19.3
creation 19.1
sexy 18.5
silhouette 18.2
attractive 18.2
fashion 18.1
body 16.8
sport 16.5
man 14.8
pretty 14.7
hair 14.3
portrait 14.2
player 14.1
model 14
adult 13.6
event 12.9
flag 12.8
style 12.6
crowd 12.5
patriotic 12.5
posing 12.4
lights 12
equipment 12
black 12
training 12
cheering 11.7
nighttime 11.7
audience 11.7
stadium 11.7
muscular 11.5
athlete 11.4
nation 11.4
male 11.3
brunette 11.3
television 10.8
symbol 10.8
cute 10.8
covering 10.7
studio 10.6
skill 10.6
standing 10.4
brassiere 10.4
icon 10.3
lifestyle 10.1
competition 10.1
happy 10
smile 10
park 9.8
shorts 9.8
lady 9.7
fight 9.7
match 9.6
work 9.1
sensual 9.1
exercise 9.1
make 9.1
design 9
human 9
active 9
negative 8.9
versus 8.9
championship 8.8
women 8.7
film 8.7
face 8.5
clothes 8.4
woman's clothing 8.3
undergarment 8.3
makeup 8.2
garment 8.2
art 8.2
stylish 8.1
fitness 8.1
music 8.1
shiny 7.9
business 7.9
vibrant 7.9
3d 7.7
expression 7.7
professional 7.7
skin 7.6
healthy 7.6
legs 7.5
field 7.5
one 7.5
backboard 7.4
action 7.4
star 7.3
gorgeous 7.2
pose 7.2
consumer goods 7.2
bright 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 98.9
dance 98.7
person 96.7
clothing 89.9
woman 88.3
dress 74.5

Face analysis

Amazon

Google

AWS Rekognition

Age 29-39
Gender Male, 84.1%
Sad 93.3%
Calm 5.4%
Confused 0.4%
Angry 0.3%
Disgusted 0.2%
Surprised 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 35-43
Gender Female, 81.4%
Calm 81.7%
Surprised 16.8%
Sad 0.7%
Confused 0.4%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
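Each AWS Rekognition face record above reports a full confidence distribution over eight emotions. A minimal sketch, purely illustrative, of reducing such a distribution to its dominant emotion, using the first face's scores as listed above:

```python
def dominant_emotion(emotions):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda item: item[1])


# Emotion scores for the first face detected by AWS Rekognition above.
face_1 = {
    "Sad": 93.3, "Calm": 5.4, "Confused": 0.4, "Angry": 0.3,
    "Disgusted": 0.2, "Surprised": 0.1, "Happy": 0.1, "Fear": 0.1,
}
print(dominant_emotion(face_1))  # → ('Sad', 93.3)
```

Note that the scores are per-emotion confidences, not a statement that the person actually feels that way; the second face above would reduce to Calm at 81.7.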

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing in front of a window 58.3%
a group of people in front of a window 57.6%
a group of people sitting in front of a window 45.7%

Text analysis

Amazon

7E
KODAK
SAFETY
KODAK SAFETY FILM
FILM
ILM

Google

KODAK
S'AFETY
7E
ILM
7E ILM KODAK S'AFETY FILM
FILM