Human Generated Data

Title

Untitled (interior view of department store with shoppers walking around and parents looking at red wagons in center)

Date

1940-1950

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6542

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99.5
Person 99.5
Person 99
Person 99
Wheel 98.7
Machine 98.7
Wheel 98.2
Bicycle 96.5
Bike 96.5
Transportation 96.5
Vehicle 96.5
Person 95.2
Person 94.5
Person 93.7
Person 93
Person 84.3
Shop 79.9
Person 79.2
Person 77.4
Person 71.1
Monitor 63.5
Electronics 63.5
Screen 63.5
Display 63.5
Bazaar 57.2
Market 57.2
Person 43.3

Clarifai
created on 2019-03-25

people 99.9
group together 98.6
group 98.6
vehicle 98.5
adult 97.9
man 96.5
street 95.6
transportation system 93.5
monochrome 93
many 92.9
stock 91.8
woman 89.6
crowd 88.8
furniture 88.5
commerce 85.2
recreation 84.8
administration 84.3
child 84
bike 83.3
several 81.6

Imagga
created on 2019-03-25

shop 29.3
people 26.7
business 21.2
man 20.8
mercantile establishment 19.1
person 18.6
male 18.4
stall 17.6
men 17.2
bartender 16.5
businessman 15.9
city 14.9
barbershop 14.8
seller 14
room 12.9
women 12.6
place of business 12.5
interior 12.4
group 12.1
work 11.9
indoor 11.9
counter 11.6
blackboard 11.6
life 11.5
indoors 11.4
worker 11.2
building 10.9
house 10.9
lifestyle 10.8
restaurant 10.6
job 10.6
adult 10.6
office 10.6
table 10.4
architecture 10.1
modern 9.8
black 9.6
chair 9.6
sitting 9.4
buy 9.4
old 9
salon 9
design 9
home 8.8
happy 8.8
corporate 8.6
brass 8.5
wind instrument 8.5
meeting 8.5
music 8.2
computer 8
customer 7.6
communication 7.5
horizontal 7.5
silhouette 7.4
floor 7.4
holding 7.4
supermarket 7.4
inside 7.4
new 7.3
color 7.2
looking 7.2
night 7.1

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

outdoor 85.6
store 47.1
street 47.1
black and white 43.3
person 19.5
monochrome 12.9

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.2%
Surprised 45.1%
Angry 45.1%
Sad 45.3%
Confused 45.1%
Calm 54%
Happy 45.3%
Disgusted 45%

AWS Rekognition

Age 26-43
Gender Female, 53.6%
Sad 47.1%
Calm 46.5%
Happy 46.8%
Surprised 45.6%
Disgusted 47.8%
Confused 45.5%
Angry 45.8%

AWS Rekognition

Age 49-69
Gender Female, 50.6%
Sad 53.8%
Disgusted 45.1%
Happy 45%
Calm 45.4%
Angry 45.2%
Confused 45.5%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Female, 52.9%
Angry 45.2%
Disgusted 45.1%
Sad 46%
Surprised 45.2%
Happy 45.2%
Calm 53%
Confused 45.2%

AWS Rekognition

Age 12-22
Gender Female, 53.9%
Sad 46.1%
Disgusted 45.5%
Calm 46.5%
Happy 49.4%
Confused 45.7%
Surprised 46.1%
Angry 45.7%

AWS Rekognition

Age 38-59
Gender Female, 54.4%
Sad 45.6%
Confused 45.5%
Happy 48.5%
Angry 46.3%
Calm 45.6%
Disgusted 47.9%
Surprised 45.6%

AWS Rekognition

Age 38-59
Gender Female, 50.6%
Sad 45.9%
Angry 46%
Calm 47.4%
Surprised 45.4%
Happy 49.1%
Confused 45.3%
Disgusted 45.8%

AWS Rekognition

Age 20-38
Gender Female, 51.4%
Surprised 45.2%
Confused 45.1%
Calm 46.1%
Disgusted 45.1%
Angry 45.2%
Happy 45.1%
Sad 53.1%

Feature analysis

Amazon

Person 99.5%
Wheel 98.7%
Bicycle 96.5%

Text analysis

Amazon

GRAND
OPaING
OPaING Sale
Sale
OPEY
SPECIRE
SUPED
NoR GRAND
Ramre SPECIRE
RADXO SUPED
Ramre
RADXO
NoR

Google

OPENING GR RADIO
OPENING
GR
RADIO