Human Generated Data

Title

Untitled (grocery store event with woman in costume serving food)

Date

1955-1957

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6443

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Human 99.1
Person 99.1
Person 99
Person 98.7
Person 98.2
Person 97.9
Indoors 97.9
Interior Design 97.9
Person 97.8
Restaurant 94.5
Person 94.4
Hat 91.9
Apparel 91.9
Clothing 91.9
Monitor 88.5
Screen 88.5
Display 88.5
Electronics 88.5
Person 87.1
Person 84.7
Cafeteria 82.9
LCD Screen 80.4
Crowd 65.9
People 65.8
Meal 65.4
Food 65.4
Cafe 62.4
Urban 61.8
Person 60.5
Television 59.6
TV 59.6
Chair 57.1
Furniture 57.1
City 56.4
Building 56.4
Town 56.4
Musical Instrument 56.2
Musician 56.2
Food Court 55.4
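
The Amazon tags above are label/confidence pairs of the kind returned by the Rekognition DetectLabels API. Below is a minimal sketch of how such a list can be reproduced with boto3; the image file name, MaxLabels, and MinConfidence values are illustrative assumptions, not part of this record. The repeated "Person" rows are consistent with per-instance detections of a single label.

import boto3

# Hypothetical local path to the digitized photograph (not part of the record).
IMAGE_PATH = "untitled_grocery_store_event.jpg"

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # enough to cover a list the size of the one above
        MinConfidence=55.0,  # the lowest score listed above is 55.4
    )

# Each label has a name and a confidence score; labels such as Person also
# carry per-instance bounding boxes, one per detected occurrence.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    for instance in label.get("Instances", []):
        print("  instance:", instance["BoundingBox"])

The "Feature analysis" section later in this record (Person 99.1%, Hat 91.9%) lists exactly the labels that typically carry instance bounding boxes, which is consistent with this response shape.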

Clarifai
created on 2019-03-22

people 99.7
many 97.4
group 97.2
man 96.2
group together 96.2
street 94.2
woman 93
adult 92.7
crowd 92.3
monochrome 87.6
music 85.4
child 81.4
room 77.1
several 75.8
education 74.3
airport 74.3
city 73.7
indoors 73.7
war 73.6
exhibition 72.9
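
The Clarifai concepts above appear to come from Clarifai's general prediction model. A hedged sketch against the v2 REST predict endpoint follows; the API key, image URL, and model identifier are placeholders and assumptions. Clarifai reports concept values in the 0-1 range, so they are scaled here to match the percentages shown.

import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"       # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed ID of the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts arrive sorted by value, mirroring the descending list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')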

Imagga
created on 2019-03-22

man 26.2
restaurant 24
building 23.7
people 21.7
business 20
interior 19.5
work 16.5
indoors 15.8
center 15.7
chair 15.2
shop 14.8
men 14.6
musical instrument 14.2
light 14
modern 14
group 13.7
male 13.5
transportation 13.4
person 13.3
structure 13.3
equipment 12.9
office 12.8
indoor 12.8
factory 12.6
adult 12.3
urban 12.2
computer 12.1
industry 12
industrial 11.8
architecture 11.7
device 11.4
barbershop 11.4
percussion instrument 11.1
inside 11
machine 10.8
city 10.8
steel drum 10.7
passenger 10.2
room 10.1
stage 10
worker 9.9
travel 9.9
job 9.7
steel 9.7
working 9.7
station 9.7
black 9.6
corporate 9.4
meeting 9.4
power 9.2
music 9
lifestyle 8.7
engine 8.7
glass 8.6
sitting 8.6
house 8.4
silhouette 8.3
platform 8.1
mercantile establishment 8
gate 8
life 8
employee 7.9
departure 7.9
table 7.8
train 7.7
window 7.5
hot 7.5
occupation 7.3
design 7.3
transport 7.3
metal 7.2
women 7.1
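
The Imagga list above matches the shape of Imagga's /v2/tags endpoint, which returns English tag names with confidences. A minimal sketch, assuming placeholder credentials and a hosted image URL:

import requests

IMAGGA_KEY = "YOUR_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # assumed URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # Imagga uses HTTP Basic auth
)
resp.raise_for_status()

# Each entry nests the language-keyed tag name under "tag".
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')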

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Male, 53.9%
Happy 46.3%
Surprised 45.1%
Calm 51.4%
Angry 45.2%
Confused 45.2%
Sad 46.6%
Disgusted 45.1%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Surprised 49.5%
Angry 49.6%
Confused 49.6%
Calm 50%
Sad 49.7%
Happy 49.6%
Disgusted 49.6%

AWS Rekognition

Age 15-25
Gender Male, 52.2%
Disgusted 45.2%
Happy 45.2%
Calm 52.7%
Sad 46.1%
Angry 45.3%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Disgusted 49.5%
Angry 49.6%
Confused 49.5%
Sad 49.7%
Happy 49.6%
Calm 50%
Surprised 49.6%

AWS Rekognition

Age 48-68
Gender Female, 53%
Angry 45.5%
Surprised 45.2%
Disgusted 45.3%
Sad 51.1%
Calm 47.3%
Happy 45.2%
Confused 45.4%
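
Each "AWS Rekognition" block above (an age range, a gender with confidence, and one confidence per emotion) matches the FaceDetails structure returned by Rekognition's DetectFaces when all attributes are requested. A minimal sketch, assuming the same hypothetical local file path as earlier:

import boto3

client = boto3.client("rekognition")

with open("untitled_grocery_store_event.jpg", "rb") as f:  # assumed path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for age, gender, and emotion estimates
    )

# One FaceDetails entry per detected face, mirroring the five blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')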

Feature analysis

Amazon

Person 99.1%
Hat 91.9%

Text analysis

Amazon

LOSt
3sV9
Trbe
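
The strings above are OCR detections reproduced verbatim, including the garbled fragments; for Amazon they correspond to Rekognition's DetectText output. A minimal sketch, again assuming a local file path:

import boto3

client = boto3.client("rekognition")

with open("untitled_grocery_store_event.jpg", "rb") as f:  # assumed path
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections aggregate WORD detections; low-contrast signage in older
# photographs often yields fragments like "LOSt" or "3sV9" rather than words.
for det in response["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"], f'{det["Confidence"]:.1f}')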

Google

SUPERIOR
SUPERIOR
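
For Google, equivalent strings come from Cloud Vision text detection. A minimal sketch with the google-cloud-vision client follows; the duplicated "SUPERIOR" is consistent with Vision returning the full detected text first, followed by one annotation per word.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google credentials are set

with open("untitled_grocery_store_event.jpg", "rb") as f:  # assumed path
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text block; subsequent ones are per word,
# which can make a single word like "SUPERIOR" appear twice in flat exports.
for annotation in response.text_annotations:
    print(annotation.description)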