Human Generated Data

Title

Untitled (two photographs: male employees posing in front of shoe boxes in store; female employees behind counter in handbag department)

Date

c. 1950, printed later

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13824

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Transportation 99.6
Vehicle 99.6
Truck 99.6
Person 99.4
Human 99.4
Person 99.3
Person 99.2
Person 98.5
Poster 98.3
Advertisement 98.3
Collage 98.3
Person 98.2
Person 98
Person 96.3
Person 96.2
Person 93
Person 90
Wheel 88.9
Machine 88.9
Tire 87.1
Automobile 83.7
Car 83.7
Person 77.7
Car Wheel 77.5
Truck 75.8
Electronics 74.7
Screen 74.7
Spoke 68.1
Person 66.1
Display 65
Monitor 65
Wheel 58.9
Person 58.2
Wheel 51.6
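
The Amazon tags above are label/confidence pairs from AWS Rekognition (created 2019-11-16). Below is a minimal sketch of how comparable output could be requested through boto3; the file name and the MaxLabels and MinConfidence settings are illustrative assumptions, not the parameters used for this record.

import boto3

# Hypothetical local scan of the photograph; not the file actually used for this record.
with open("4.2002.13824.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# detect_labels returns a list of {Name, Confidence} pairs like the tags listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,       # assumption: cap roughly matching the number of tags shown
    MinConfidence=50,   # assumption: report labels at or above 50% confidence
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")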

Clarifai
created on 2019-11-16

people 98.9
street 98.3
transportation system 97.8
vehicle 97.3
airport 96.7
monochrome 95.9
group together 94.1
man 92.2
train 91.6
airplane 90.7
group 89.5
aircraft 87.9
city 85.6
subway system 84.1
locomotive 84
two 83.4
many 83
railway 82
light 81
adult 77.2

Imagga
created on 2019-11-16

billboard 46.2
signboard 36.7
structure 29.5
car 22.6
transportation 22.4
city 21.6
travel 21.1
transport 18.2
urban 17.5
sky 16.6
device 16
architecture 15.6
vehicle 14.5
road 14.4
equipment 13.7
light 13.4
speed 12.8
business 12.7
office 12.6
old 12.5
radio 12.1
building 11.9
highway 11.6
tourism 11.5
automobile 11.5
truck 11.4
electronic equipment 10.8
clouds 10.1
street 10.1
people 10
vacation 9.8
traffic 9.5
motion 9.4
famous 9.3
landmark 9
technology 8.9
river 8.9
lamp 8.7
scene 8.6
motor vehicle 8.5
broadcasting 8.5
fast 8.4
black 8.4
modern 8.4
entertainment 8.3
station 8.3
retro 8.2
airport 8.2
tourist 8.1
water 8
space 7.7
driving 7.7
industry 7.7
auto 7.6
projector 7.5
vintage 7.4
night 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

vehicle 98.1
land vehicle 95.2
car 94.6
wheel 88.2
text 82.9
white 82.5
black and white 71.8
appliance 57.3
old 41.4
kitchen appliance 19.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-28
Gender Female, 54.3%
Calm 45.2%
Happy 54.4%
Angry 45.1%
Disgusted 45.1%
Fear 45.1%
Sad 45%
Confused 45.1%
Surprised 45.1%

AWS Rekognition

Age 32-48
Gender Female, 54.9%
Angry 45.4%
Disgusted 45.2%
Fear 45.3%
Sad 45.8%
Happy 52.7%
Calm 45.1%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 4-12
Gender Female, 53.1%
Angry 45%
Sad 54.6%
Calm 45.2%
Fear 45.1%
Happy 45.1%
Surprised 45%
Disgusted 45%
Confused 45%

AWS Rekognition

Age 19-31
Gender Male, 53.3%
Sad 45.1%
Confused 45%
Fear 45%
Happy 45.1%
Surprised 45%
Angry 45%
Calm 54.8%
Disgusted 45%

AWS Rekognition

Age 25-39
Gender Male, 50.5%
Disgusted 49.5%
Happy 49.5%
Angry 49.5%
Confused 49.6%
Calm 50.3%
Fear 49.5%
Sad 49.5%
Surprised 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Angry 49.6%
Happy 49.5%
Fear 49.5%
Disgusted 49.5%
Sad 49.8%
Calm 50%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 22-34
Gender Female, 50.5%
Confused 49.5%
Surprised 49.5%
Sad 49.5%
Calm 49.5%
Disgusted 49.5%
Happy 50.5%
Fear 49.5%
Angry 49.5%

AWS Rekognition

Age 22-34
Gender Female, 51.4%
Happy 47.1%
Sad 45.2%
Confused 45.2%
Disgusted 45.2%
Angry 45.2%
Surprised 45.3%
Calm 51.7%
Fear 45.1%

AWS Rekognition

Age 39-57
Gender Male, 50.5%
Surprised 49.5%
Fear 49.7%
Angry 49.5%
Disgusted 49.5%
Sad 49.7%
Calm 50.1%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Fear 49.5%
Disgusted 49.5%
Calm 49.6%
Angry 49.5%
Sad 49.5%
Surprised 49.5%
Happy 50.3%
Confused 49.5%

AWS Rekognition

Age 26-42
Gender Male, 50.5%
Calm 49.7%
Surprised 49.6%
Disgusted 49.5%
Happy 50%
Angry 49.6%
Confused 49.5%
Sad 49.6%
Fear 49.6%

AWS Rekognition

Age 36-54
Gender Male, 50.5%
Calm 49.5%
Angry 49.6%
Surprised 49.6%
Confused 49.5%
Disgusted 49.7%
Happy 49.5%
Sad 49.7%
Fear 49.7%

AWS Rekognition

Age 22-34
Gender Male, 50.4%
Fear 49.5%
Calm 49.5%
Angry 49.5%
Happy 50.5%
Confused 49.5%
Disgusted 49.5%
Sad 49.5%
Surprised 49.5%

AWS Rekognition

Age 36-52
Gender Male, 50.4%
Surprised 49.5%
Calm 49.6%
Fear 49.5%
Disgusted 49.6%
Happy 50.1%
Angry 49.5%
Sad 49.6%
Confused 49.5%
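
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with confidence, and scores for eight emotions. A minimal sketch of how such per-face attributes could be retrieved with boto3 follows, again treating the file name as a placeholder.

import boto3

# Hypothetical local scan of the photograph; not the file actually used for this record.
with open("4.2002.13824.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotions, not just bounding boxes.
faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in faces["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")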

Feature analysis

Amazon

Truck 99.6%
Person 99.4%
Wheel 88.9%

Text analysis

Amazon

99
G:CHOSTERY
Hanayg 99
Hanayg
00000909
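
The Amazon strings above are raw OCR output. A minimal sketch of how line-level text detections could be pulled from AWS Rekognition with boto3, with the file name again a placeholder:

import boto3

# Hypothetical local scan of the photograph; not the file actually used for this record.
with open("4.2002.13824.jpg", "rb") as f:
    image_bytes = f.read()

rekognition = boto3.client("rekognition")

# detect_text returns LINE and WORD detections; the strings above resemble LINE entries.
text = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in text["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])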

Google

ERY FLAMMABLE- COMPRESSID GAS4
ERY
FLAMMABLE-
COMPRESSID
GAS4