Human Generated Data

Title

Untitled (interior view of restaurant with employees in background)

Date

1949

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6237

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Restaurant 99.9%
Person 99.5%
Human 99.5%
Person 98.9%
Cafe 97.2%
Person 96.8%
Cafeteria 95.2%
Meal 95%
Food 95%
Chair 87.6%
Furniture 87.6%
Food Court 76%
Person 72.4%
Diner 59.9%
Shop 56.4%
Lighting 55.2%
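
The labels above have the shape of output from Amazon Rekognition's DetectLabels operation. As a minimal sketch (assuming configured AWS credentials; the bucket and object names are hypothetical), comparable tags could be produced with boto3:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Hypothetical bucket and object key.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "my-images", "Name": "photo.jpg"}},
        MaxLabels=20,
        MinConfidence=50,  # drop labels scored below 50%
    )

    # Each label carries a name and a 0-100 confidence score.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")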

Clarifai
created on 2023-10-26

monochrome 98.9%
furniture 98.6%
people 98.4%
street 97.5%
architecture 97.1%
room 96.9%
stock 96.6%
city 95.4%
desk 95.3%
table 95.1%
indoors 95.1%
restaurant 94.4%
inside 94.4%
chair 93%
diner 91.4%
seat 90.2%
man 90%
library 88.5%
urban 87.7%
group 87.4%
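
Clarifai returns comparable concepts from its predict endpoint. A hedged sketch against the plain REST API (the API key, model ID, and image URL are all placeholders; consult Clarifai's documentation for current model identifiers):

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
    MODEL_ID = "general-image-recognition"  # assumed public model ID

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    )
    response.raise_for_status()

    # Concepts are scored on a 0-1 scale; scale to percent to match the record.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}%")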

Imagga
created on 2022-01-22

center 47.4%
counter 36.9%
city 27.4%
architecture 26.2%
building 25.8%
interior 23%
restaurant 20.7%
modern 18.9%
urban 18.3%
transportation 17.9%
night 17.8%
transport 16.4%
travel 15.5%
industry 15.4%
office 14%
light 14%
business 14%
street 13.8%
room 13.1%
furniture 13.1%
lights 13%
structure 12.9%
sky 12.8%
steel 12.4%
buildings 12.3%
cockpit 12.1%
inside 12%
factory 11.9%
industrial 11.8%
table 11.4%
station 11.2%
chair 10.7%
equipment 10.5%
traffic 10.4%
empty 10.3%
construction 10.3%
bridge 9.5%
window 9.4%
water 9.3%
glass 9.3%
town 9.3%
house 9.2%
car 8.9%
decor 8.8%
work 8.6%
machine 8.6%
skyline 8.5%
shop 8.5%
destination 8.4%
dark 8.3%
tourism 8.2%
vacation 8.2%
technology 8.2%
metal 8%
design 7.9%
production 7.8%
public 7.8%
train 7.7%
heavy 7.6%
lamp 7.6%
wood 7.5%
evening 7.5%
barroom 7.4%
vehicle 7.4%
indoor 7.3%
new 7.3%
decoration 7.2%
landmark 7.2%
ship 7.2%
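
Imagga serves its tags from the /v2/tags REST endpoint, authenticated with an API key/secret pair. A minimal sketch (credentials and image URL are placeholders):

    import requests

    # Placeholder credentials.
    auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=auth,
    )
    response.raise_for_status()

    # Each tag carries a 0-100 confidence value.
    for item in response.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}%")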

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

indoor 97.6%
black and white 95.3%
text 94%
building 67.1%
table 67.1%
furniture 56.2%
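
The Microsoft tags correspond to Azure Computer Vision's image-analysis call. A sketch using the azure-cognitiveservices-vision-computervision SDK (endpoint, key, and image URL are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    analysis = client.analyze_image(
        "https://example.org/photo.jpg",
        visual_features=[VisualFeatureTypes.tags],
    )

    # Tag confidence is reported on a 0-1 scale.
    for tag in analysis.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}%")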

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 85.7%
Calm 93.2%
Happy 2.3%
Fear 2.2%
Sad 1.5%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 20-28
Gender Male, 60.6%
Sad 42.2%
Happy 40.9%
Confused 7.5%
Disgusted 2.6%
Surprised 2.1%
Fear 2%
Angry 1.7%
Calm 1%

AWS Rekognition

Age 28-38
Gender Male, 58.6%
Calm 42.7%
Happy 34.4%
Sad 15.6%
Fear 3.4%
Surprised 1.1%
Angry 1.1%
Confused 0.9%
Disgusted 0.7%
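
Each block above matches the shape of Rekognition's DetectFaces response when all facial attributes are requested: an estimated age range, a gender guess with confidence, and emotion scores that sum to roughly 100%. A minimal sketch (the local file name is hypothetical):

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; order by confidence as in the record above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")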

Feature analysis

Amazon

Person 99.5%
Chair 87.6%

Categories

Captions

Microsoft
created on 2022-01-22

a view of a kitchen 69.3%
a view of a restaurant 69.2%
a view of the inside of a restaurant 69.1%
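
The ranked captions come from the same Azure service's describe operation, which returns several candidate sentences with confidences. A self-contained sketch (endpoint, key, and image URL are placeholders):

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://YOUR-RESOURCE.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    # Ask for up to three candidate captions, as in the record above.
    description = client.describe_image(
        "https://example.org/photo.jpg",
        max_candidates=3,
    )

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")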

Text analysis

Amazon

VAGOY

Google

TOTR
TOTR
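
The strings above are OCR hits from signage in the scene. A minimal sketch of Rekognition's DetectText (the local file name is hypothetical); it reports both LINE and WORD detections, which is one reason the same token can appear more than once in a record, and Google's OCR behaves similarly:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:  # hypothetical local file
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is either a whole LINE or one of its WORD pieces.
    for detection in response["TextDetections"]:
        print(f"{detection['Type']}: {detection['DetectedText']} "
              f"({detection['Confidence']:.1f}%)")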