Human Generated Data

Title

Trolley -- New Orleans

Date

1955-1956

People

Artist: Robert Frank, American, 1924-2019

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.750

Copyright

© Robert Frank Estate

Machine Generated Data

Tags

Amazon
created on 2019-04-04

Human 99.3
Person 99.3
Person 99.3
Person 99.3
Person 98.9
Person 98
Person 97.8
Vehicle 88.4
Transportation 88.4
Person 88.3
Food 75
Meal 75
Train 71.1
Kiosk 69.5
Restaurant 57.8
Sitting 57.3
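
The number beside each tag is the service's confidence score, expressed as a percentage. As a rough, hypothetical sketch (not the museum's actual pipeline), labels like these can be requested from AWS Rekognition with boto3; the file name and region below are placeholders:

```python
# Hypothetical sketch: requesting image labels from AWS Rekognition via boto3.
# The local file name and region are placeholders, not part of the record.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("trolley_new_orleans.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # drop labels the model is less than 50% sure of
    )

for label in response["Labels"]:
    # Confidence is a 0-100 score, matching the figures listed above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```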

Clarifai
created on 2018-03-23

people 100
group 99.9
group together 99.5
many 99.1
adult 98.8
vehicle 98.8
several 98.6
three 97.9
man 97.9
four 97.8
administration 97.6
war 97.3
woman 96.3
transportation system 95.6
leader 95.3
two 94.5
street 91.6
military 90.6
furniture 90.3
child 90
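
A comparable request against Clarifai's general model, sketched with its v2 REST API; the API key, model alias, and image URL are placeholders, and Clarifai reports values on a 0-1 scale rather than as percentages:

```python
# Hypothetical sketch of a Clarifai v2 tagging request. The key, model alias,
# and image URL are placeholders; consult Clarifai's docs for current auth.
import requests

MODEL_ID = "general-image-recognition"  # assumed alias for the general model
url = f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs"
headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/trolley.jpg"}}}]}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Scale the 0-1 value to match the percentages in the list above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```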

Imagga
created on 2018-03-23

case 80.3
furniture 32.5
architecture 28.9
building 25.7
old 23
furnishing 21.1
buffet 19.3
design 18.5
interior 17.7
wood 16.7
city 14.9
structure 14.4
frame 14.2
antique 14.2
house 14.2
travel 13.4
wooden 13.2
shop 12.8
vintage 12.6
retro 12.3
film 12.1
ancient 12.1
construction 12
style 11.9
shelf 11.8
window 11.7
art 11.7
history 11.6
urban 11.3
modern 11.2
home 11.2
grunge 11.1
glass 10.9
cabinet 10.7
light 10.7
box 10.5
container 10.4
empty 10.3
historic 10.1
business 9.7
black 9.6
sky 9.6
inside 9.2
indoor 9.1
texture 9
brown 8.8
mercantile establishment 8.7
decoration 8.7
palace 8.7
wall 8.5
historical 8.5
exterior 8.3
transport 8.2
dirty 8.1
digital 8.1
transportation 8.1
decor 7.9
chest 7.9
paper 7.8
hall 7.8
space 7.7
culture 7.7
apartment 7.7
chair 7.6
place 7.4
classic 7.4
church 7.4
table 7.4
entertainment 7.4
water 7.3
room 7.3
aged 7.2
night 7.1
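
Imagga's tags can be fetched in much the same spirit; this sketch uses its v2 REST endpoint with HTTP basic auth, and the key/secret pair and image URL are placeholders:

```python
# Hypothetical sketch of an Imagga v2 tagging request. The credentials and
# image URL are placeholders.
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
params = {"image_url": "https://example.com/trolley.jpg"}

response = requests.get("https://api.imagga.com/v2/tags", auth=auth, params=params)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Confidence is already a 0-100 score, as in the list above.
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```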

Face analysis

AWS Rekognition

Age 4-7
Gender Female, 53.7%
Disgusted 45%
Happy 45%
Confused 45%
Calm 45%
Angry 45%
Surprised 45%
Sad 54.9%

AWS Rekognition

Age 26-43
Gender Female, 51.7%
Angry 46%
Sad 47.3%
Disgusted 47.1%
Surprised 45.2%
Calm 49%
Happy 45.2%
Confused 45.2%

AWS Rekognition

Age 4-9
Gender Female, 53.9%
Surprised 45%
Disgusted 45%
Happy 45%
Sad 46.2%
Calm 53.5%
Angry 45.2%
Confused 45.1%

AWS Rekognition

Age 35-52
Gender Male, 54.9%
Calm 46.2%
Disgusted 45.9%
Surprised 45.3%
Happy 45.2%
Sad 51.8%
Angry 45.3%
Confused 45.3%

AWS Rekognition

Age 35-53
Gender Male, 54%
Angry 45.4%
Disgusted 45.2%
Sad 45.3%
Confused 45.4%
Calm 52.3%
Happy 45.8%
Surprised 45.6%
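
Each block above corresponds to one detected face: an estimated age range, a gender guess with its confidence, and one confidence score per emotion. A minimal sketch of the underlying call, assuming boto3 and a placeholder file name:

```python
# Hypothetical sketch: per-face attributes from AWS Rekognition detect_faces.
# Attributes=["ALL"] requests age range, gender, and emotions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("trolley_new_orleans.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # One confidence score per emotion type, as in the blocks above.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```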

Microsoft Cognitive Services

Age 49
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female
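
Microsoft's estimates above are point values rather than ranges. A hypothetical sketch of the Azure Face API detect call as it existed when these tags were generated (the age and gender attributes have since been retired); endpoint, key, and image URL are placeholders:

```python
# Hypothetical sketch of an Azure Face API v1.0 detect request returning age
# and gender estimates. All credentials and URLs are placeholders.
import requests

endpoint = "https://YOUR_REGION.api.cognitive.microsoft.com/face/v1.0/detect"
headers = {"Ocp-Apim-Subscription-Key": "YOUR_FACE_API_KEY"}
params = {"returnFaceAttributes": "age,gender"}
payload = {"url": "https://example.com/trolley.jpg"}

response = requests.post(endpoint, headers=headers, params=params, json=payload)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```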

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
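
Google Vision reports likelihood buckets ("Very unlikely" through "Very likely") instead of numeric scores. A minimal sketch with the google-cloud-vision client, assuming a placeholder file path:

```python
# Hypothetical sketch: per-face likelihoods from the Google Cloud Vision API.
# The enum names map to the "Very unlikely" / "Possible" labels above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("trolley_new_orleans.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        likelihood = getattr(face, f"{field}_likelihood")
        print(field.capitalize(), vision.Likelihood(likelihood).name)
```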

Feature analysis

Amazon

Person 99.3%
Train 71.1%
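
These feature entries likely come from the same label-detection response as the tag list above: certain labels (such as Person and Train) also carry localized bounding-box instances. A hypothetical sketch, reusing the placeholder setup from earlier:

```python
# Hypothetical sketch: labels with localized instances from detect_labels.
# Only some label types (e.g. Person, Vehicle) include bounding boxes.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("trolley_new_orleans.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # ratios of image width/height
        print(f"{label['Name']} {instance['Confidence']:.1f}%", box)
```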

Text analysis

Amazon

UGS
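
Strings like the one above are the raw output of optical text detection on the photograph; partial or reversed lettering often comes through fragmentary. A minimal sketch with Rekognition's detect_text, again with a placeholder file name:

```python
# Hypothetical sketch: detected text from AWS Rekognition detect_text.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("trolley_new_orleans.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"])
```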