Human Generated Data

Title

Untitled (auto show, large crowd of spectators looking at cars)

Date

1953

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13518

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 99.5
Human 99.5
Person 99.2
Person 98.8
Person 94.6
Person 94.1
Person 93.9
Person 92.2
Automobile 90.3
Vehicle 90.3
Transportation 90.3
Person 89.7
Sports Car 89.5
Car 89.2
Car 88.4
Person 87.3
Person 82.6
Person 82.5
Person 74.3
Tire 73.6
Person 73.3
Machine 72.5
Person 71.5
Wheel 71
Person 70.9
Wheel 70.6
Coupe 69.9
Spoke 68.2
Car Wheel 67.1
Overcoat 65.9
Clothing 65.9
Apparel 65.9
Coat 65.9
Suit 65.5
Building 63.9
Text 58.1
Alloy Wheel 57.6
Race Car 57.1
Face 57.1
People 56.3
Person 42.6

Imagga
created on 2022-02-04

barbershop 100
shop 85.5
mercantile establishment 66.6
place of business 44.4
city 24.9
transportation 24.2
establishment 22
building 21.5
travel 20.4
urban 20.1
business 18.8
modern 18.2
architecture 18
chair 17.9
car 17.9
transport 17.3
room 16.3
road 16.3
office 15.9
street 14.7
vehicle 14.5
seat 14.5
traffic 14.2
structure 13
sky 12.7
highway 12.5
interior 12.4
center 11.5
salon 11.4
empty 11.2
train 10.6
automobile 10.5
auto 10.5
furniture 10.4
industry 10.2
speed 10.1
light 10
steel 9.7
cityscape 9.5
drive 9.5
town 9.3
power 9.2
tourism 9.1
landscape 8.9
high 8.7
table 8.7
work 8.6
motion 8.6
construction 8.6
black 8.4
fast 8.4
horizontal 8.4
house 8.4
station 8.3
vacation 8.2
working 8
indoors 7.9
rush 7.9
equipment 7.8
nobody 7.8
lamp 7.6
cinema 7.6
journey 7.5
place 7.4
environment 7.4
inside 7.4
people 7.2
barber chair 7.2
river 7.1
night 7.1

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 97.7
land vehicle 94.7
vehicle 94.3
person 90.1
outdoor 89.2
wheel 82
car 80.3
black and white 78.8
white 63.9

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 95.5%
Calm 99.9%
Surprised 0%
Sad 0%
Angry 0%
Happy 0%
Confused 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 20-28
Gender Female, 73.7%
Happy 94.3%
Sad 4%
Calm 0.7%
Angry 0.2%
Fear 0.2%
Surprised 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 18-26
Gender Male, 56.6%
Sad 49.5%
Calm 41.2%
Confused 3.2%
Happy 2.4%
Angry 1.1%
Disgusted 1%
Surprised 1%
Fear 0.8%

AWS Rekognition

Age 23-33
Gender Female, 83.2%
Sad 66.7%
Calm 11.2%
Confused 7.9%
Fear 6.7%
Happy 4%
Disgusted 1.5%
Angry 1.2%
Surprised 0.9%

AWS Rekognition

Age 6-14
Gender Female, 59%
Sad 49.4%
Calm 36.9%
Confused 9.2%
Angry 2.4%
Happy 0.9%
Fear 0.6%
Disgusted 0.4%
Surprised 0.2%

AWS Rekognition

Age 21-29
Gender Female, 59.4%
Calm 98.6%
Confused 0.7%
Sad 0.4%
Surprised 0.1%
Disgusted 0.1%
Happy 0%
Angry 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.5%
Car 89.2%
Wheel 71%

Captions

Microsoft

a group of people standing in front of a building 85.3%
a group of people standing in front of a car 64.2%
a group of people standing in front of a crowd 64.1%

Text analysis

Amazon

TODAY
HERE
DARRIN
1
CHISUM
KODAK
THE
SAFETY KODAK
WELCOME
SAFETY
THE Henry J
KAISER
Henry J
WATER

Google

HERE
WEICOME
WEICOME CHISUM HERE TODAY
TODAY
CHISUM