Human Generated Data

Title

Untitled (waiting room for advertising agency)

Date

1948

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20053

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.5
Person 98.1
Person 96.4
Shoe 95.2
Clothing 95.2
Footwear 95.2
Apparel 95.2
Clinic 92.2
Person 90.9
Person 89.1
Person 79.8
Furniture 79.3
Sitting 78.6
Chair 75
Person 72.6
Person 71.4
Hospital 68.7
Shop 65.8
Person 60.6
Person 60.3
Cafeteria 56.6
Restaurant 56.6
Workshop 56.1
People 55.4
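
The label-and-confidence pairs above are the kind of output Amazon Rekognition's detect_labels call returns. A minimal sketch of how tags like these could be reproduced with boto3 follows; the filename and the confidence floor are assumptions for illustration, not part of the original record.

```python
import boto3

# Rekognition client; region and credentials come from the usual AWS configuration.
client = boto3.client("rekognition")

# Read the scanned photograph from disk (hypothetical filename).
with open("untitled_waiting_room.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels above a confidence floor comparable to the lowest score listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

# Print "Label confidence" pairs, mirroring the tag list format.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```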

Clarifai
created on 2023-10-22

people 99.8
group together 99.2
room 98.6
furniture 98.6
group 97.6
adult 97.1
man 96.9
indoors 96.8
chair 96.1
woman 94.9
monochrome 94.4
seat 94
three 92.8
two 91.6
recreation 90.2
four 89.8
five 87.9
sit 85.6
several 84.7
child 82.9

Imagga
created on 2022-03-05

barbershop 100
shop 83.9
mercantile establishment 62
place of business 42.2
interior 41.6
salon 37
chair 31.8
room 30
table 26
modern 25.2
furniture 21.5
establishment 21.3
people 20.1
inside 19.3
indoors 18.4
business 18.2
floor 17.7
architecture 17.2
window 16.5
design 16.3
indoor 15.5
building 15.4
home 15.2
urban 14.9
empty 14.6
office 14.6
men 13.7
house 13.4
light 13.4
wood 13.3
restaurant 12.7
gym 12.4
decor 12.4
city 11.6
man 11.4
luxury 11.1
training 11.1
women 11.1
lifestyle 10.8
chairs 10.8
health 10.4
life 10.4
seat 10.4
contemporary 10.3
hairdresser 10.3
work 10.2
glass 10.1
transportation 9.9
equipment 9.8
working 9.7
station 9.7
group 9.7
style 9.6
apartment 9.6
lamp 9.5
decoration 9.4
cafeteria 9.3
train 9
steel 8.8
lighting 8.7
comfortable 8.6
living 8.5
3d 8.5
male 8.5
travel 8.4
elegance 8.4
occupation 8.3
exercise 8.2
kitchen 8
adult 7.8
scene 7.8
exercising 7.7
residential 7.7
mirror 7.6
strength 7.5
stylish 7.2
fitness 7.2
hall 7.1
person 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

indoor 95
furniture 91.8
black and white 90.6
text 87
table 84.4
person 73.2
clothing 50.6

Face analysis

AWS Rekognition

Age 2-10
Gender Female, 79.8%
Sad 86.3%
Calm 8.3%
Disgusted 2%
Fear 1%
Angry 1%
Confused 0.6%
Happy 0.5%
Surprised 0.3%

AWS Rekognition

Age 35-43
Gender Female, 55.5%
Calm 87.8%
Angry 8.3%
Sad 1.5%
Happy 1%
Surprised 0.5%
Fear 0.4%
Disgusted 0.3%
Confused 0.2%
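
Age-range, gender, and emotion estimates like the two blocks above come from Rekognition's face detection when all facial attributes are requested. A minimal sketch with boto3, assuming the same hypothetical filename as before:

```python
import boto3

client = boto3.client("rekognition")

with open("untitled_waiting_room.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition for age range, gender, and emotions per face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 35, "High": 43}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 55.5}
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)

    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")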

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
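
The repeated surprise/anger/sorrow/joy/headwear/blurred likelihoods above match the per-face attributes returned by Google Cloud Vision's face detection. A minimal sketch with the google-cloud-vision client library, again using a hypothetical filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_waiting_room.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each detected face carries likelihood enums such as VERY_UNLIKELY or LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```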

Feature analysis

Amazon

Person 99.6%
Person 99.5%
Person 98.1%
Person 96.4%
Person 90.9%
Person 89.1%
Person 79.8%
Person 72.6%
Person 71.4%
Person 60.6%
Person 60.3%
Shoe 95.2%

Text analysis

Amazon

ERICKSON
ADVERTISING
MCCANN ERICKSON ADVERTISING
MCCANN

Google

MCCANN ERICKSON ADVERTISING
MCCANN
ERICKSON
ADVERTISING
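
Text results like "MCCANN ERICKSON ADVERTISING" and its individual words are typical of the line- and word-level detections returned by both services' OCR endpoints. A minimal sketch of how these could be obtained, under the same filename assumption:

```python
import boto3
from google.cloud import vision

with open("untitled_waiting_room.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

# Amazon Rekognition: detect_text returns both LINE and WORD detections,
# which would account for the full phrase as well as its parts.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))

# Google Cloud Vision: the first text annotation is the full detected block,
# followed by the individual words.
vision_client = vision.ImageAnnotatorClient()
response = vision_client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```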