Human-Generated Data

Title

Untitled (bishop leaning to speak to woman seated in church pew)

Date

1957

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Machine-Generated Data

Tags

Amazon

Person 99.4%
Human 99.4%
Clothing 97.7%
Apparel 97.7%
Person 96.1%
Face 92%
Furniture 84.7%
Chair 84.7%
Coat 73.9%
Overcoat 73.9%
Suit 73.9%
People 71.6%
Female 71.4%
Person 69.7%
Performer 67.5%
Sitting 67.1%
Person 66.1%
Portrait 64.7%
Photo 64.7%
Photography 64.7%
Sleeve 62.4%
Woman 61.1%
Crowd 60.1%
Smile 58.6%
Worker 57%
Long Sleeve 55.9%
Fashion 55.3%
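
Label/confidence pairs of this shape are what AWS Rekognition's DetectLabels API returns. A minimal sketch, assuming boto3 is configured with credentials and using a placeholder S3 location for the photograph:

import boto3

# Rekognition client; region and credentials are read from the environment.
client = boto3.client("rekognition")

# DetectLabels returns label names with confidence scores on a 0-100
# scale, matching the tag/percentage pairs listed above. The S3 bucket
# and key below are placeholders.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "my-archive-bucket", "Name": "photo.jpg"}},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}%")

The repeated Person entries above likely reflect per-instance detections, which DetectLabels reports under each label's Instances field.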

Clarifai

people 99.8%
adult 99.3%
woman 96%
man 95.7%
sit 95%
monochrome 94%
wear 93.1%
group 92.2%
child 91.8%
sitting 90.8%
chair 90.6%
uniform 89.1%
indoors 86.6%
side view 86.2%
two 85.4%
outfit 83%
face disguise 81.6%
war 81.2%
veil 81%
furniture 80.7%
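
Clarifai's general model scores each concept between 0 and 1; scaled by 100 the values line up with the percentages above. A minimal sketch against Clarifai's public v2 REST endpoint, where the API key, image URL, and the general-image-recognition model ID are all assumptions:

import requests

# Clarifai v2 predict call for the general image-recognition model.
# The API key and image URL are placeholders.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()

# Each concept carries a name and a value in [0, 1].
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}%")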

Imagga

computer 41.6%
laptop 35.5%
man 34.9%
people 30.7%
working 28.3%
person 27.7%
business 27.3%
office 26.3%
work 25.9%
male 24.8%
adult 23.2%
professional 23.1%
home 22.3%
technology 22.3%
smiling 20.3%
worker 18.7%
businessman 18.5%
clothing 18.2%
robe 18%
garment 17.8%
sitting 17.2%
indoors 16.7%
corporate 16.3%
businesswoman 14.5%
happy 14.4%
coat 14%
desk 13.6%
businesspeople 13.3%
table 13.2%
job 12.4%
medical 12.4%
smile 11.4%
monitor 11.4%
men 11.2%
lifestyle 10.8%
interior 10.6%
wireless 10.5%
research 10.5%
portrait 10.4%
keyboard 10.3%
executive 10.3%
lab coat 10.1%
holding 9.9%
newspaper 9.9%
equipment 9.8%
profession 9.6%
house 9.2%
information 8.9%
notebook 8.8%
looking 8.8%
doctor 8.5%
senior 8.4%
communication 8.4%
manager 8.4%
horizontal 8.4%
hand 8.4%
indoor 8.2%
team 8.1%
product 8%
negative 7.9%
day 7.8%
computers 7.8%
face 7.8%
lab 7.8%
corporation 7.7%
laboratory 7.7%
hospital 7.7%
modern 7.7%
using 7.7%
serious 7.6%
workplace 7.6%
adults 7.6%
meeting 7.5%
room 7.5%
clothes 7.5%
study 7.5%
engineer 7.4%
phone 7.4%
suit 7.3%
nurse 7.3%
covering 7.3%
group 7.3%
kitchen 7.2%
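
Imagga reports tags on a 0-100 confidence scale, the scale shown above. A minimal sketch against Imagga's /v2/tags endpoint, with placeholder credentials and image URL:

import requests

# Imagga authenticates with HTTP Basic auth (API key and secret);
# both values here are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    timeout=30,
)
resp.raise_for_status()

# Each entry holds a confidence (0-100) and the tag text keyed by language.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}%")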

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Female, 54.4%
Sad 1.3%
Surprised 0.1%
Calm 97.9%
Happy 0.1%
Confused 0.2%
Disgusted 0.1%
Angry 0.3%

AWS Rekognition

Age 23-38
Gender Female, 89%
Sad 69.4%
Calm 26.5%
Happy 0.9%
Angry 1%
Disgusted 0.7%
Surprised 0.7%
Confused 0.9%

AWS Rekognition

Age 20-38
Gender Female, 54.8%
Sad 47.1%
Disgusted 45.9%
Calm 45.6%
Surprised 46.7%
Confused 46.6%
Angry 45.7%
Happy 47.3%

AWS Rekognition

Age 14-25
Gender Female, 54.7%
Disgusted 45.3%
Calm 46.3%
Angry 46.4%
Sad 49.6%
Happy 46.1%
Confused 45.8%
Surprised 45.5%

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Angry 45.4%
Surprised 45.8%
Disgusted 45.4%
Sad 45.6%
Calm 50.3%
Happy 47%
Confused 45.5%
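
Each block above corresponds to one detected face, in the shape returned by AWS Rekognition's DetectFaces API: an estimated age range, a gender guess with its confidence, and a score per emotion when all attributes are requested. A minimal sketch, again with a placeholder S3 location:

import boto3

client = boto3.client("rekognition")

# Attributes=["ALL"] requests age range, gender, and emotions for every
# face found. Bucket and key are placeholders.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "my-archive-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion types arrive uppercase (e.g. "CALM"); title-case for display.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")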

Feature analysis

Amazon

Person 99.4%
Chair 84.7%

Captions

Microsoft

a group of people looking at a laptop 67.5%
a group of people sitting in front of a laptop 62.4%
a person sitting in front of a laptop 62.3%
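
Captions with confidence scores like the three above are what the describe feature of Microsoft's Computer Vision REST API produces. A minimal sketch, assuming API version v3.2 and placeholder endpoint, key, and image URL:

import requests

# Azure Computer Vision "describe" call; endpoint, key, and image URL
# are placeholders. maxCandidates asks for several caption candidates.
resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Caption confidence is reported in [0, 1]; scale to match the list above.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")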