Human Generated Data

Title

Untitled (crowd outside department store)

Date

1950

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6265

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.4
Human 98.4
Transportation 97.9
Car 97.9
Automobile 97.9
Vehicle 97.9
Interior Design 96.3
Indoors 96.3
Person 96.1
Car 94.8
Sedan 93.6
Person 93.3
Person 92.2
Person 89.5
Machine 87.8
Wheel 87.8
Car 87.5
Person 85
Car 84.3
Person 76.1
Car 74.5
Person 72
Person 69.7
Person 68.5
Person 66.5
Person 65.8
Person 65.6
Sports Car 62.4
Bumper 58.8
Home Decor 58.6
Building 58.2
Crowd 57.7
Coupe 56.8
Car 56.5
Car 55.7
Person 48.3

Imagga
created on 2022-01-22

architecture 34.9
building 33.6
city 29.9
structure 28
urban 27.1
hall 24.1
room 23.4
modern 22.4
sketch 21.5
deck 21
office 20.3
center 19.4
construction 18.8
interior 18.6
cinema 18.5
transportation 17.9
gymnasium 17.9
sky 17.9
classroom 17.7
business 17.6
house 16.7
theater 14.8
buildings 14.2
glass 14
floor 13.9
travel 13.4
facility 13
athletic facility 12.8
drawing 12.4
steel 12.4
metal 12.1
station 11.9
transport 11.9
light 11.4
bridge 11.4
design 11.3
supermarket 11.1
window 11
reflection 10.6
table 10.4
industry 10.2
water 10
billboard 9.8
traffic 9.5
chair 9.4
wall 9.4
3d 9.3
representation 9.2
grocery store 9
furniture 8.7
roof 8.6
inside 8.3
tourism 8.3
vacation 8.2
industrial 8.2
road 8.1
landmark 8.1
night 8
offices 7.9
nobody 7.8
empty 7.7
residential 7.7
lamp 7.6
development 7.6
cityscape 7.6
destination 7.5
evening 7.5
landscape 7.4
exterior 7.4
street 7.4
indoor 7.3
new 7.3
school 7.2
signboard 7.1
river 7.1
day 7.1

Microsoft
created on 2022-01-22

building 99.9
text 98.7
black and white 93
outdoor 92.9
car 89.5
vehicle 88.9
land vehicle 76.1
several 25

Face analysis

Amazon

AWS Rekognition

Age 23-33
Gender Female, 72.1%
Sad 67.6%
Happy 11.8%
Fear 7.1%
Calm 5.1%
Angry 2.7%
Confused 2.3%
Surprised 1.9%
Disgusted 1.5%

AWS Rekognition

Age 20-28
Gender Female, 96.4%
Calm 75%
Happy 16.2%
Angry 3%
Sad 2.6%
Surprised 1.3%
Disgusted 1.1%
Fear 0.6%
Confused 0.3%

Feature analysis

Amazon

Person 98.4%
Car 97.9%
Wheel 87.8%

Captions

Microsoft

a group of people in front of a building 88.3%
a group of people standing in front of a building 85%
a group of people in a car 64.3%

Text analysis

Amazon

CO.
SHOP
J.C.PENNEY CO.
S
J.C.PENNEY
TO
50 TO
Marty
50
day
KOBAK-
KOBAK- EVELTA
EVELTA

Google

PENNEY
J.C. PENNEY CO. gMarly SHOP
SHOP
J.C.
CO.
gMarly