Human Generated Data

Title

Untitled (people in supermarket watching safety film, view from front)

Date

1940

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4166

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.3
Human 99.3
Restaurant 99.1
Person 98.4
Person 98.3
Person 97.8
Cafeteria 90.8
Person 88.5
Person 86.7
Meal 80.2
Food 80.2
Chair 77.9
Furniture 77.9
Automobile 73.3
Transportation 73.3
Vehicle 73.3
Car 73.3
Cafe 72.1
Food Court 71.3
People 66.2
Shop 64
Accessories 61.5
Accessory 61.5
Tie 61.5
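The label-and-confidence pairs above resemble output from Amazon Rekognition's DetectLabels operation. A minimal sketch of how similar tags could be produced with boto3 is shown below; the local file path "image.jpg" is hypothetical, and AWS credentials are assumed to be configured.

```python
# Minimal sketch (assumptions: boto3 installed, AWS credentials configured,
# and a local scan of the photograph at the hypothetical path "image.jpg").
import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},   # send image bytes directly
        MinConfidence=60,            # only keep labels at or above 60% confidence
    )

# Print each label with its confidence, matching the "Person 99.3" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```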

Clarifai
created on 2019-06-01

people 99.8
monochrome 99
adult 98
group together 97.7
many 96.2
group 95.9
man 95.3
woman 92.8
several 86.5
furniture 84.7
administration 83.8
child 83.1
vehicle 82.2
crowd 78.9
street 77.8
transportation system 77.5
stock 76.8
sit 76.4
indoors 73.1
leader 72.8

Imagga
created on 2019-06-01

interior 23.9
business 23.7
negative 21.7
architecture 20.3
sketch 19.4
house 19.2
case 18.5
film 18.4
room 18.3
modern 18.2
home 17.5
drawing 17.3
design 16.9
building 16.3
office 14.8
technology 14.8
work 14.1
equipment 14.1
construction 13.7
photographic paper 13.5
newspaper 13.5
glass 13.3
table 13.3
finance 12.7
architect 12.5
furniture 12.1
floor 12.1
people 11.7
product 11.7
stall 11.6
science 11.6
apartment 11.5
medical 11.5
medicine 11.4
plan 11.3
money 11.1
city 10.8
creation 10.7
plaything 10.7
bank 10.6
clean 10
toy 9.5
paper 9.5
industry 9.4
dollar 9.3
galley 9.1
photographic equipment 9
currency 9
hospital 8.9
digital 8.9
hall 8.8
lab 8.7
laboratory 8.7
project 8.7
appliance 8.6
research 8.6
window 8.5
3d 8.5
hand 8.4
sign 8.3
investment 8.2
kitchen 8.2
group 8.1
success 8
businessman 7.9
structure 7.9
white goods 7.8
chemical 7.8
chemistry 7.7
professional 7.7
restaurant 7.6
exchange 7.6
vessel 7.6
doctor 7.5
person 7.5
representation 7.5
inside 7.4
banking 7.4
indoor 7.3
new 7.3
color 7.2
computer 7.2
team 7.2
bright 7.1
refrigerator 7.1
copy 7.1
steel 7.1
working 7.1
indoors 7

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 91.4
person 90.1
billboard 63.8
black and white 63.4
man 62.2
shop 25.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 51%
Calm 48.8%
Disgusted 45.5%
Confused 45.5%
Surprised 45.5%
Happy 47.1%
Sad 46.7%
Angry 45.8%

AWS Rekognition

Age 26-43
Gender Female, 53.7%
Angry 45.2%
Surprised 45.1%
Disgusted 45.1%
Sad 45.9%
Calm 45.1%
Happy 53.4%
Confused 45.1%

AWS Rekognition

Age 35-52
Gender Female, 54%
Confused 46.5%
Surprised 45.4%
Calm 48.4%
Sad 49.2%
Happy 45.2%
Disgusted 45.1%
Angry 45.1%

AWS Rekognition

Age 38-59
Gender Male, 51.2%
Confused 45.6%
Happy 47.4%
Calm 46.8%
Disgusted 46.1%
Sad 46%
Angry 46.9%
Surprised 46.1%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Happy 49.6%
Disgusted 49.6%
Angry 49.6%
Calm 49.6%
Sad 50.1%
Confused 49.5%
Surprised 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Confused 49.6%
Surprised 49.5%
Happy 49.5%
Angry 49.5%
Sad 49.9%
Disgusted 49.5%
Calm 49.9%

AWS Rekognition

Age 26-43
Gender Female, 54.5%
Happy 47.6%
Surprised 45.7%
Calm 46.9%
Sad 46.1%
Disgusted 47.8%
Angry 45.6%
Confused 45.3%

AWS Rekognition

Age 14-25
Gender Male, 54.4%
Happy 45.3%
Calm 46.8%
Angry 45.3%
Surprised 45.4%
Disgusted 51.1%
Confused 45.5%
Sad 45.6%

AWS Rekognition

Age 26-43
Gender Male, 50.3%
Happy 49.6%
Calm 49.6%
Angry 49.5%
Confused 49.5%
Surprised 49.5%
Sad 50.1%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Sad 49.7%
Surprised 49.5%
Disgusted 49.6%
Angry 49.5%
Calm 50%
Happy 49.6%
Confused 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Happy 49.6%
Surprised 49.6%
Angry 49.6%
Confused 49.6%
Calm 49.9%
Sad 49.8%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Sad 49.6%
Angry 49.6%
Surprised 49.6%
Calm 49.6%
Disgusted 49.9%
Happy 49.7%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Surprised 49.7%
Sad 49.6%
Happy 49.5%
Angry 49.6%
Disgusted 49.5%
Confused 49.7%
Calm 49.9%

AWS Rekognition

Age 26-44
Gender Female, 50.4%
Sad 49.7%
Calm 50%
Surprised 49.5%
Angry 49.6%
Disgusted 49.6%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 20-38
Gender Female, 53.9%
Sad 45.5%
Confused 45.3%
Disgusted 47.6%
Surprised 45.4%
Angry 45.4%
Happy 46.7%
Calm 49.1%

AWS Rekognition

Age 23-38
Gender Female, 53.7%
Angry 45.4%
Happy 47.9%
Confused 45.5%
Calm 48.5%
Surprised 45.8%
Sad 46.7%
Disgusted 45.2%

AWS Rekognition

Age 27-44
Gender Female, 50.4%
Disgusted 49.6%
Sad 49.6%
Surprised 49.6%
Happy 49.9%
Angry 49.6%
Calm 49.7%
Confused 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Angry 49.7%
Sad 49.6%
Disgusted 49.7%
Happy 49.6%
Calm 49.7%
Surprised 49.7%
Confused 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Angry 49.5%
Happy 49.6%
Confused 49.5%
Calm 50.3%
Disgusted 49.6%
Sad 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Sad 49.6%
Angry 49.6%
Calm 49.7%
Confused 49.5%
Disgusted 49.8%
Happy 49.6%
Surprised 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Surprised 49.5%
Sad 49.5%
Angry 49.5%
Disgusted 50%
Calm 49.8%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 14-23
Gender Female, 50.5%
Disgusted 49.9%
Angry 49.6%
Confused 49.5%
Sad 49.5%
Happy 49.6%
Calm 49.7%
Surprised 49.6%
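The per-face age range, gender, and emotion scores above resemble output from Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch, under the same assumptions as the earlier example (boto3, configured credentials, hypothetical "image.jpg"):

```python
# Minimal sketch (assumptions: boto3 installed, AWS credentials configured,
# hypothetical local file "image.jpg").
import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],          # request age range, gender, emotions, etc.
    )

# Print one block per detected face, mirroring the listings above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```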

Feature analysis

Amazon

Person 99.3%
Chair 77.9%
Car 73.3%
Tie 61.5%

Categories

Imagga

paintings art 61.1%
interior objects 28.7%
text visuals 9.2%

Text analysis

Amazon

SPECIALTHIS
HAMS
SPECIALTHIS W
OYSTEKS
W
FEET
PICS
bune
TCHOPS
-HHAA
RIOE OYSTEKS
Pond
RIOE
MASI
FEE
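The word fragments above (including OCR misreads of the signage) resemble output from Rekognition's DetectText operation. A minimal sketch, under the same assumptions as the earlier examples:

```python
# Minimal sketch (assumptions: boto3 installed, AWS credentials configured,
# hypothetical local file "image.jpg").
import boto3

client = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# WORD detections are individual fragments like the entries listed above;
# LINE detections group them into lines of text.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```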

Google

SPECIAL THIS FEETHAMS
SPECIAL
THIS
FEETHAMS