Human Generated Data

Title

Untitled (people in waiting room)

Date

1949

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20167

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.8%
Human 98.8%
Person 96.1%
Plant 87.9%
Person 87.9%
Flower 86.3%
Blossom 86.3%
Flower Arrangement 78.8%
Flooring 78.5%
Art 76.6%
Clothing 76%
Apparel 76%
Sitting 70.5%
Flower Bouquet 68.7%
Person 62.4%
Interior Design 60%
Indoors 60%
Ikebana 58.5%
Jar 58.5%
Ornament 58.5%
Vase 58.5%
Pottery 58.5%
Furniture 58.4%
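
Tag lists in this form match the output of AWS Rekognition's DetectLabels operation. A minimal Python sketch via boto3 follows; the file name, label cap, and confidence threshold are illustrative placeholders, not values taken from this record.

import boto3

def tag_image(path, max_labels=25, min_confidence=55.0):
    # Credentials are assumed to be configured in the environment.
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=max_labels,
        MinConfidence=min_confidence,
    )
    # Each label carries a name and a 0-100 confidence score, matching
    # the "Tag NN.N%" pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}%')

tag_image("waiting_room.jpg")  # placeholder path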

Clarifai
created on 2023-10-22

people 99.5%
monochrome 99.4%
indoors 96.7%
man 96.5%
chair 96.5%
furniture 96.4%
woman 95.8%
group 95.1%
wedding 94.7%
adult 93.8%
girl 92.1%
couple 91.4%
two 90.8%
mirror 90%
portrait 89.9%
family 89.9%
room 89.3%
actor 87.6%
model 87.2%
seat 86.9%
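
The Clarifai list above has the shape returned by Clarifai's v2 predict endpoint with a general image-recognition model. A hedged sketch follows; the API key, model id, and image URL are placeholders, and the endpoint layout is an assumption based on Clarifai's documented v2 REST API rather than anything in this record.

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-model id

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with a 0-1 "value"; scaling by 100 gives
# percentages like the "people 99.5%" entries above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}%')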

Imagga
created on 2022-03-05

man 17.1%
groom 16.7%
person 16.2%
people 16.2%
business 15.2%
room 14.5%
businessman 14.1%
male 13.5%
bouquet 13.2%
office 13%
group 12.1%
success 12.1%
corporate 12%
teamwork 11.1%
professional 11.1%
work 11%
team 10.8%
light 10.7%
decoration 10.4%
men 10.3%
3d 10.1%
paper 10%
automaton 9.9%
negative 9.8%
idea 9.8%
device 9%
worker 8.9%
style 8.9%
home 8.9%
interior 8.8%
film 8.8%
meeting 8.5%
screen 8.4%
modern 8.4%
successful 8.2%
businesswoman 8.2%
suit 8.1%
love 7.9%
toilet tissue 7.8%
black 7.8%
table 7.8%
sitting 7.7%
hand 7.6%
art 7.5%
happy 7.5%
wedding 7.4%
lifestyle 7.2%
tissue 7.2%
adult 7.2%
celebration 7.2%
flower arrangement 7.1%
portrait 7.1%
job 7.1%
together 7%
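
The Imagga list above follows the shape of Imagga's v2 tagging endpoint. A sketch under that assumption; the key, secret, and image URL are placeholders.

import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),  # placeholders
)
response.raise_for_status()

# Imagga scores tags on a 0-100 scale, with English names under tag["en"].
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}%')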

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 82%
Happy 40.5%
Confused 18%
Calm 17.4%
Disgusted 8.9%
Surprised 5.1%
Angry 4.8%
Sad 4.7%
Fear 0.7%

AWS Rekognition

Age 38-46
Gender Male, 98.6%
Calm 99.8%
Sad 0.1%
Confused 0%
Angry 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 72%
Calm 100%
Sad 0%
Fear 0%
Happy 0%
Disgusted 0%
Surprised 0%
Confused 0%
Angry 0%
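
The three face blocks above match AWS Rekognition's DetectFaces output with full attributes requested. A minimal boto3 sketch; the file path is a placeholder and credentials are assumed to be configured in the environment.

import boto3

client = boto3.client("rekognition")
with open("waiting_room.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    # Emotions are scored independently; the record lists them from
    # most to least confident.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')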

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
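
The bucketed ratings above (Very unlikely through Very likely) are how Google Cloud Vision reports face attributes. A sketch using the google-cloud-vision client library (version 2+ assumed); the file path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("waiting_room.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Vision returns Likelihood enums (VERY_UNLIKELY .. VERY_LIKELY)
    # rather than numeric confidences.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)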

Feature analysis

Amazon

Person
Person 98.8%
Person 96.1%
Person 87.9%
Person 62.4%

Categories

Imagga

interior objects 98.6%

Text analysis

Amazon

as
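
The single detected string "as" above is the kind of result AWS Rekognition's DetectText returns on an image with little legible text. A minimal boto3 sketch; the file path is a placeholder.

import boto3

client = boto3.client("rekognition")
with open("waiting_room.jpg", "rb") as f:  # placeholder path
    response = client.detect_text(Image={"Bytes": f.read()})

# DetectText returns both LINE and WORD detections; printing the LINE
# entries reproduces a flat list like the one above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])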