Human Generated Data

Title

Untitled (Louis Fabian Bachrach and Mrs. Herbert Hoover)

Date

1930s

People

Artist: Bradford Bachrach, American, active 1912–1993

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14119

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.8
Person 98.3
Human 98.3
Person 96.4
Chair 89.9
Tripod 86.9
Shoe 80.2
Footwear 80.2
Clothing 80.2
Apparel 80.2
Overcoat 72.5
Coat 67
Canopy 66.5
Photo 65.3
Photography 65.3
Suit 59.2
Portrait 57.3
Face 57.3
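
Labels of this shape are what Amazon Rekognition's DetectLabels API returns. A minimal sketch of such a call (not necessarily the pipeline used for this record; the file name, region, and thresholds are placeholder assumptions):

```python
import boto3

# Placeholder region and image file; Rekognition returns label names with
# 0-100 confidence scores like those listed above.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # the list above holds roughly 20 labels
        MinConfidence=55.0,  # lowest score shown above is about 57
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```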

Clarifai
created on 2023-10-26

people 99.8
chair 98.9
umbrella 98.8
furniture 97.5
adult 97
two 96.9
woman 96.6
seat 96.4
monochrome 95.7
man 94.8
wear 94.8
street 93.8
portrait 93.3
bench 92.3
wedding 90.7
sit 90.3
leader 84.1
one 82.8
group 82.7
three 80.8
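
A hypothetical sketch of a call to Clarifai's public "general" model that yields concept lists like the one above; the personal access token, model ID, user/app IDs, and image URL are placeholder assumptions, not values taken from this record:

```python
import requests

PAT = "YOUR_CLARIFAI_PAT"  # placeholder personal access token
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    # Public community model owned by the "clarifai/main" app (assumed).
    "user_app_id": {"user_id": "clarifai", "app_id": "main"},
    "inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}],
}
resp = requests.post(URL, headers={"Authorization": f"Key {PAT}"},
                     json=payload, timeout=30)
resp.raise_for_status()

# Concepts come back with 0-1 scores; the list above shows them as 0-100.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```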

Imagga
created on 2022-01-22

table 20.2
decoration 19.6
interior 17.7
glass 17.6
chair 17.3
wine 17
home 16.7
restaurant 16
dinner 15.1
seller 15
salon 14.8
celebration 14.3
party 13.7
house 13.4
decor 13.3
room 12.9
drink 12.5
dining 12.4
holiday 12.2
inside 12
people 11.7
food 11.2
wedding 11
indoors 10.5
flowers 10.4
luxury 10.3
person 10.2
service 10.2
glasses 10.2
building 10
shop 9.8
style 9.6
man 9.4
event 9.2
champagne 9
urban 8.7
groom 8.7
light 8.7
fancy 8.7
setting 8.7
work 8.6
bouquet 8.6
design 8.4
wood 8.3
furniture 8.1
stall 8.1
day 7.8
napkin 7.8
architecture 7.8
alcohol 7.7
arrangement 7.7
vintage 7.7
sitting 7.7
men 7.7
tree 7.7
old 7.7
marriage 7.6
traditional 7.5
barbershop 7.5
musical instrument 7.3
mercantile establishment 7.1
meal 7.1
modern 7
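
Tags of this shape can be fetched from Imagga's v2 tagging endpoint; the credentials and image URL below are placeholders, not the setup behind this record:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a 0-100 confidence and an English tag name.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```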

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 93.7
clothing 93.5
musical instrument 92.1
black and white 86.2
man 72.8
text 72.7
furniture 17.2
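
A sketch of the corresponding Azure Computer Vision call; the record does not state which API version produced the tags above, so v3.2, the endpoint, and the key below are assumptions:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Confidences are returned on a 0-1 scale; the list above shows 0-100.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```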

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 74.8%
Calm 95.4%
Confused 3.7%
Happy 0.3%
Sad 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 38-46
Gender Male, 99.5%
Happy 99.8%
Surprised 0.1%
Calm 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%
Sad 0%
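
The two AWS Rekognition blocks above, one per detected face, hold the service's age-range, gender, and emotion estimates. A minimal sketch of the underlying DetectFaces call; the file name and region are placeholder assumptions:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```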

Microsoft Cognitive Services

Age 43
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
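
The two Google Vision blocks above, one per face, use the API's five-step likelihood scale rather than numeric scores. A sketch of the face_detection call behind them; the image path is a placeholder and credentials are assumed to be configured:

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set in the environment.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods come back as enum members such as VERY_UNLIKELY or LIKELY,
# matching the buckets shown above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```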

Feature analysis

Amazon

Person 98.3%
Chair 89.9%
Shoe 80.2%
Coat 67%

Categories

Imagga

paintings art 95.5%
interior objects 1.3%
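
These category labels match Imagga's categorizer endpoint. A sketch under stated assumptions: "personal_photos" is an assumed categorizer ID (its published categories include "paintings art" and "interior objects"), and the credentials and URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')
```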

Text analysis

Amazon

BRADFORD
BRADFORD BACHRACH
BACHRACH

Google

BRADFORI) BACHRACH
BRADFORI)
BACHRACH
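
Both text readings above are raw OCR output, preserved as returned (including Google's misread "BRADFORI)"). A sketch of the Amazon side via Rekognition's DetectText API; the file name and region are placeholders, and Google's equivalent would be the Vision API's text_detection method:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# DetectText returns both LINE and WORD detections, which is why
# "BRADFORD BACHRACH" appears alongside its individual words above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])
```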