Human Generated Data

Title

Untitled (crowded schools)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16061

Machine Generated Data

Tags

Amazon
created on 2022-03-25

Person 99.7
Human 99.7
Person 99.7
Person 99.5
Person 99.3
Person 97.7
Person 96.3
Shopping 95.6
Clothing 74.3
Apparel 74.3
Female 71.2
Person 70
Bag 59.1
People 57.4
Woman 56.4
Cafeteria 55.5
Restaurant 55.5

Clarifai
created on 2023-10-29

people 99.9
group 99.4
woman 98.3
adult 98
man 97.7
group together 97.6
monochrome 95.9
child 93.1
education 91.4
indoors 89.5
four 89
school 88.8
three 88.5
queue 83.9
teacher 82.8
five 82.6
elementary school 82.4
adolescent 79.2
several 78.1
side view 77.7

Imagga
created on 2022-03-25

barbershop 100
shop 84.3
mercantile establishment 61.6
hairdresser 45.5
man 43.7
place of business 41
people 36.8
male 31.9
person 28.4
office 23.8
adult 23.5
business 22.5
establishment 20.5
working 20.3
businessman 20.3
professional 19.9
corporate 19.8
happy 19.4
indoors 18.4
room 17.8
home 17.5
men 16.3
looking 16
executive 15.7
computer 15.2
job 15
patient 14.7
meeting 14.1
work 14.1
education 13.9
smiling 13.7
to 13.3
lifestyle 13
portrait 12.9
hospital 12.9
group 12.9
sitting 12.9
laptop 12.8
businesswoman 12.7
medical 12.4
doctor 12.2
couple 12.2
hand 12.1
communication 11.8
interior 11.5
businesspeople 11.4
career 11.4
indoor 11
worker 10.8
holding 10.7
family 10.7
modern 10.5
together 10.5
table 10.4
occupation 10.1
horizontal 10
team 9.9
medicine 9.7
technology 9.6
standing 9.6
women 9.5
adults 9.5
desk 9.4
happiness 9.4
manager 9.3
mature 9.3
teamwork 9.3
clinic 9.3
back 9.2
human 9
cheerful 8.9
color 8.9
smile 8.6
teacher 8.5
casual 8.5
senior 8.4
screen 8.4
house 8.4
chair 8.3
health 8.3
successful 8.2
care 8.2
board 8.1
monitor 8.1
handsome 8
attractive 7.7
life 7.7
two 7.6
talking 7.6
classroom 7.4
inside 7.4
confident 7.3
student 7.2
black 7.2
child 7

Google
created on 2022-03-25

Microsoft
created on 2022-03-25

person 99.8
clothing 97.5
man 86
text 84.6
group 70.6
people 63.8
woman 63.5

Color Analysis

Face analysis

AWS Rekognition

Age 16-22
Gender Male, 80.8%
Calm 50.1%
Surprised 27.3%
Fear 7.7%
Angry 5.2%
Confused 4.1%
Sad 3.3%
Disgusted 1.2%
Happy 1%

AWS Rekognition

Age 13-21
Gender Female, 90.3%
Calm 82%
Sad 15.3%
Fear 0.8%
Angry 0.6%
Confused 0.6%
Disgusted 0.4%
Happy 0.2%
Surprised 0.2%

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 99.6%
Confused 0.2%
Sad 0.1%
Angry 0.1%
Happy 0%
Surprised 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 13-21
Gender Female, 96%
Calm 98.6%
Surprised 0.9%
Confused 0.3%
Sad 0.1%
Disgusted 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 14-22
Gender Male, 70.4%
Calm 93.6%
Surprised 3.4%
Sad 1%
Confused 0.6%
Angry 0.5%
Disgusted 0.4%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 16-24
Gender Female, 99.9%
Calm 76.9%
Angry 22.2%
Sad 0.3%
Disgusted 0.2%
Confused 0.2%
Fear 0.1%
Surprised 0.1%
Happy 0%

AWS Rekognition

Age 20-28
Gender Female, 99.6%
Calm 91%
Angry 4.5%
Sad 2.6%
Fear 0.8%
Confused 0.5%
Surprised 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 29-39
Gender Female, 99.7%
Sad 77.2%
Fear 10.1%
Angry 4.1%
Calm 2.3%
Surprised 2.2%
Disgusted 2%
Confused 1.1%
Happy 1%

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 22
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.7%
Person 99.7%
Person 99.5%
Person 99.3%
Person 97.7%
Person 96.3%
Person 70%

Categories

Imagga

people portraits 98.9%

Text analysis

Amazon

SOCIAL
SOCIAL STUDIES
Home
STUDIES
8
Economics
XAGOX
tirn

Google

MJIF YT 37 A2 AGOX
MJIF
YT
37
A2
AGOX