Human Generated Data

Title

Untitled (African American women at home show)

Date

1947

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2737

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 98.1
Human 98.1
Person 96.4
Person 95.8
Interior Design 95.3
Indoors 95.3
Person 91.7
Person 89.3
Person 88.8
Furniture 87.1
Person 84.8
Person 83
Person 82
Room 79.2
Person 76.1
Plant 74.7
Person 74.2
Person 72.1
People 70
Person 68.9
Flower 66.3
Blossom 66.3
Meal 65.9
Food 65.9
Person 65.5
Person 65.2
Living Room 62.4
Person 62.3
Person 61.7
Bed 58.6
Crowd 57.9
Screen 56.9
Electronics 56.9
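
The label/score pairs above have the shape of Amazon Rekognition's DetectLabels output: a label name plus a confidence from 0 to 100. A minimal sketch of how tags like these might be produced, assuming boto3 with configured AWS credentials and a hypothetical local scan of the photograph named scan.jpg; this is an illustration, not a record of the museum's actual pipeline:

# Sketch: label/confidence pairs in the style of the Amazon list above.
# Assumes boto3 is installed, AWS credentials are configured, and that a
# hypothetical local file "scan.jpg" holds the digitized photograph.
import boto3

rekognition = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out in the mid-50s
)

for label in response["Labels"]:
    # Each entry carries a name and a 0-100 confidence,
    # e.g. "Person 98.1", "Interior Design 95.3".
    print(f"{label['Name']} {label['Confidence']:.1f}")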

Clarifai
created on 2023-10-26

people 99.9
group 97.4
monochrome 97.2
group together 95.1
man 94.9
child 94
many 93.8
woman 93.3
indoors 92.4
adult 91.9
chair 91.3
education 90.8
room 87.6
administration 87
family 86.1
sit 85.8
furniture 85.3
music 84.7
war 84.7
crowd 81.1

Imagga
created on 2022-01-16

blackboard 69.8
classroom 33.9
stage 32
teacher 24.6
room 23.9
platform 23.5
people 21.7
student 19.7
person 19.2
education 17.3
man 16.8
school 16.7
male 14.9
black 14.4
business 13.3
night 13.3
class 12.5
men 12
adult 11.4
light 10.7
building 10.6
group 10.5
educator 10.2
happy 10
modern 9.8
chalkboard 9.8
old 9.7
teaching 9.7
hall 9.6
women 9.5
college 9.5
study 9.3
hand 9.1
board 9
design 9
science 8.9
math 8.8
party 8.6
architecture 8.6
professional 8.4
horizontal 8.4
house 8.3
university 8.3
human 8.2
music 8.1
urban 7.9
mathematics 7.8
portrait 7.8
youth 7.7
exam 7.7
learn 7.5
senior 7.5
silhouette 7.4
cheerful 7.3
smile 7.1
interior 7.1
businessman 7.1
travel 7

Google
created on 2022-01-16

Table 84.9
Black-and-white 84.1
Chair 83.9
Window 80.5
Adaptation 79.2
Art 76.5
Event 74.3
Snapshot 74.3
Monochrome 74.1
Monochrome photography 73.9
Font 68.5
Crowd 65.5
Building 64.7
Class 63.6
Room 63.2
Rectangle 63.1
History 61.5
Visual arts 53.6
Child 53.4
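
The Google tags above match the shape of Cloud Vision label detection, where each annotation carries a description and a score in the 0-1 range (shown in this listing scaled to percentages). A minimal sketch, assuming the google-cloud-vision client library with application default credentials and a hypothetical local file scan.jpg:

# Sketch: labels in the style of the Google list above, via Cloud Vision.
# Assumes google-cloud-vision is installed and credentials are configured;
# "scan.jpg" is a hypothetical local file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("scan.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Scores come back in 0-1; multiply by 100 to match the listing above.
    print(f"{label.description} {label.score * 100:.1f}")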

Microsoft
created on 2022-01-16

text 96.3
person 90.5
table 73.6
clothing 65.8
people 63.9
store 39.7
shop 14.7

Face analysis

Amazon

AWS Rekognition

Age 16-22
Gender Male, 81.1%
Sad 40.8%
Calm 23.8%
Fear 16.6%
Confused 9.1%
Angry 4.8%
Surprised 3%
Happy 1%
Disgusted 0.9%

AWS Rekognition

Age 23-31
Gender Female, 89%
Calm 89.6%
Sad 7.7%
Happy 1.7%
Confused 0.3%
Surprised 0.2%
Fear 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 26-36
Gender Female, 95.7%
Happy 48.3%
Calm 14.8%
Fear 13.3%
Surprised 7.2%
Sad 4.8%
Angry 4.5%
Disgusted 4.2%
Confused 2.8%

AWS Rekognition

Age 21-29
Gender Female, 99.3%
Calm 80.2%
Sad 14.9%
Happy 1.6%
Surprised 0.9%
Confused 0.9%
Fear 0.5%
Disgusted 0.5%
Angry 0.5%

AWS Rekognition

Age 20-28
Gender Female, 99.7%
Happy 91.3%
Calm 5%
Sad 2%
Angry 0.4%
Surprised 0.4%
Confused 0.3%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 12-20
Gender Female, 95.5%
Happy 61.6%
Calm 21.8%
Sad 13.7%
Confused 1.1%
Surprised 0.5%
Angry 0.5%
Disgusted 0.4%
Fear 0.4%
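
Each block above (an age range, a gender estimate with confidence, and a ranked list of emotions) has the shape of one entry in Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch, assuming boto3 with configured AWS credentials and a hypothetical local file scan.jpg:

# Sketch: per-face age/gender/emotion estimates in the style of the blocks
# above, via Rekognition DetectFaces. Assumes boto3 and AWS credentials;
# "scan.jpg" is a hypothetical local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort by confidence to mirror the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")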

Feature analysis

Amazon

Person 98.1%

Text analysis

Amazon

EEE
(ipex
736
الانيت

Google

KODVK- VEEIA
KODVK-
VEEIA
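
The detected strings above are raw OCR output from photographed signage and film-edge markings, which is why they are partly garbled; both services return a whole detected line along with its individual word fragments, as in "KODVK- VEEIA" followed by "KODVK-" and "VEEIA". A minimal sketch of the Amazon side, using Rekognition DetectText and assuming boto3 with configured AWS credentials and a hypothetical local file scan.jpg:

# Sketch: raw text detections in the style of the Amazon list above, via
# Rekognition DetectText. Assumes boto3 and AWS credentials; "scan.jpg" is
# a hypothetical local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is LINE for whole lines and WORD for their pieces, so a line and
    # its fragments can both appear in the results.
    print(detection["Type"], detection["DetectedText"])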