Human Generated Data

Title

Untitled (woman in coat and head scarf crossing city street)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14895

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.9
Clothing 99.9
Person 99.5
Human 99.5
Person 98.9
Dress 97.8
Person 97.7
Female 96.6
Person 95
Person 92.9
Woman 91.1
Person 84.2
Fashion 80.6
Gown 80.6
Robe 79.3
Transportation 75.8
Automobile 75.8
Car 75.8
Vehicle 75.8
Person 75.4
Overcoat 75.4
Suit 75.4
Coat 75.4
Bus 75.3
Car 73.8
Shorts 71.6
Photography 69.9
Portrait 69.9
Face 69.9
Photo 69.9
Pedestrian 69.1
Path 67.8
People 64.8
Skirt 64.6
Evening Dress 60.2
Wedding 59.8
Wedding Gown 58.4
Art 56.9
Drawing 56.9
Person 55.8
Crowd 55.4
Floor 55.2
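
A minimal sketch of how a label list like the Amazon tags above is typically produced, using Amazon Rekognition's DetectLabels API via boto3. The filename is hypothetical; all numbers are confidence percentages, and the repeated "Person" rows match the per-instance confidences Rekognition reports alongside the label-level score.

```python
import boto3

client = boto3.client("rekognition")

with open("4.2002.14895.jpg", "rb") as f:  # hypothetical local filename
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # lowest score shown above is 55.2
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
    # Repeated rows like the several "Person" entries correspond to
    # individual instances of the same label:
    for inst in label.get("Instances", []):
        print(f"{label['Name']} {inst['Confidence']:.1f}")
```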

Imagga
created on 2022-01-29

newspaper 100
product 90.6
creation 70.3
daily 50.7
design 20.2
old 20.2
architecture 19.5
building 19.1
wall 17.9
grunge 17
business 16.4
empty 15.4
construction 14.5
dirty 14.4
sign 14.3
space 14
blank 13.7
texture 13.2
antique 13
ancient 13
art 12.4
vintage 12.4
interior 12.4
retro 12.3
street 12
paper 11.8
work 11.8
businessman 11.5
plan 11.3
rough 10.9
paint 10.9
aged 10.8
sky 10.8
frame 10.8
male 10.6
job 10.6
engineer 10.6
pattern 10.2
border 9.9
road 9.9
office 9.8
new 9.7
designer 9.7
chart 9.6
damaged 9.5
engineering 9.5
grungy 9.5
city 9.1
industrial 9.1
material 8.9
color 8.9
designing 8.9
structure 8.9
copy 8.8
manufacturing 8.8
urban 8.7
architect 8.7
cloud 8.6
weathered 8.5
writing 8.5
floor 8.4
decorative 8.3
technology 8.2
graphic 8
looking 8
text 7.9
organizer 7.8
black 7.8
pensive 7.8
builder 7.7
project 7.7
architectural 7.7
pencil 7.6
house 7.5
manager 7.4
man 7.4
room 7.3
people 7.2
detail 7.2
home 7.2
surface 7
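
The Imagga tags above (again tag/confidence pairs) can be reproduced with Imagga's REST tagging endpoint; a minimal sketch, with placeholder credentials and a hypothetical image URL:

```python
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder
image_url = "https://example.org/4.2002.14895.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth
)
resp.raise_for_status()

for t in resp.json()["result"]["tags"]:
    print(f"{t['tag']['en']} {t['confidence']:.1f}")
```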

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99.9
black and white 91.2
outdoor 85.4
wedding dress 79.7
person 77.2
dress 75.5
clothing 75.3
street 64
bride 55.4
woman 52.6
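
The Microsoft tags above are consistent with the Azure Computer Vision tagging endpoint; a minimal sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint, key, and image URL (the SDK reports confidence as 0-1, so it is scaled here to match the percentages shown):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder
)

result = client.tag_image("https://example.org/4.2002.14895.jpg")  # hypothetical URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```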

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 99.6%
Calm 56.5%
Surprised 15.9%
Sad 13.2%
Happy 6.7%
Confused 4.4%
Angry 1.9%
Disgusted 0.9%
Fear 0.6%

AWS Rekognition

Age 29-39
Gender Male, 84.5%
Calm 90.5%
Sad 3.7%
Angry 2.6%
Disgusted 1%
Happy 0.8%
Confused 0.5%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 23-31
Gender Female, 86.2%
Calm 55.3%
Sad 26.3%
Confused 8.8%
Angry 2.2%
Fear 2.2%
Happy 1.9%
Disgusted 1.7%
Surprised 1.6%
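
Each AWS Rekognition block above corresponds to one detected face; a minimal sketch of the call that produces them, using Rekognition's DetectFaces API with a hypothetical filename:

```python
import boto3

client = boto3.client("rekognition")
with open("4.2002.14895.jpg", "rb") as f:  # hypothetical local filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come with per-emotion confidences; sorted highest first here:
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
```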

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
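
The Google Vision rows above are face-annotation likelihoods rather than percentages; a minimal sketch with the google-cloud-vision client (file path hypothetical), whose enum values such as VERY_UNLIKELY map to the "Very unlikely" labels shown:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("4.2002.14895.jpg", "rb") as f:  # hypothetical local filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```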

Feature analysis

Amazon

Person 99.5%
Car 75.8%
Bus 75.3%

Captions

Microsoft

a woman standing in front of a building 86.8%
a woman standing in front of a tall building 77.2%
a woman standing next to a building 77.1%
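
The ranked Microsoft captions above match the Azure Computer Vision describe endpoint, which returns several candidate captions with confidences; a minimal sketch with the same placeholder client setup as the tagging example:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder
)

analysis = client.describe_image(
    "https://example.org/4.2002.14895.jpg",  # hypothetical URL
    max_candidates=3,  # ask for several ranked captions, as listed above
)
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```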

Text analysis

Amazon

KODAK
SAFETY
SAFETY FILM
CO.
FILM
E
OWNIN
OWNIN S-KING & CO.
B
&
4565
S-KING
PARK
PARK RIDE
RIDE
V
SU
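
A minimal sketch of the Amazon text detection above, using Rekognition's DetectText API (filename hypothetical). DetectText returns both LINE and WORD detections, which is why fragments such as "S-KING" and the combined "OWNIN S-KING & CO." both appear:

```python
import boto3

client = boto3.client("rekognition")
with open("4.2002.14895.jpg", "rb") as f:  # hypothetical local filename
    response = client.detect_text(Image={"Bytes": f.read()})

for det in response["TextDetections"]:
    # Type is "LINE" or "WORD"; lines and their component words
    # are reported separately, hence the overlapping entries above.
    print(det["Type"], det["DetectedText"])
```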

Google

OWNI
CO.
KODAK
KING&
S'AFETY
OWNI KING& CO. KODAK S'AFETY FILM KODAK
FILM
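
A minimal sketch of the Google text detection above, using the google-cloud-vision client (file path hypothetical). The first annotation is the full concatenated text, matching the long combined line in the list; the remaining annotations are individual words:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("4.2002.14895.jpg", "rb") as f:  # hypothetical local filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```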